ASCII Debuts – Archived Post

The original source for this post is gone (the link is broken), and I could not find the post elsewhere. I am keeping it archived here so the information is not lost.

1963: ASCII Debuts – By Mary Brandel – 04/12/99

If it weren’t for a particular development in 1963, we wouldn’t have e-mail and there would be no World Wide Web. Cursor movement, laser printers and video games — all of these owe a big debt of gratitude to this technological breakthrough.

What is it? Something most of us take for granted today: ASCII. Yep, plain old ASCII, that simplest of text formats.

To understand why ASCII (pronounced AS-KEE) is such a big deal, you have to realize that before it, different computers had no way to communicate with one another. Each manufacturer had its own way of representing letters in the alphabet, numbers and control codes. “We had over 60 different ways to represent characters in computers. It was a real Tower of Babel,” says Bob Bemer, who was instrumental in ASCII’s development and is widely known as “the father of ASCII.”

ASCII, which stands for American Standard Code for Information Interchange, functions as a common denominator between computers that otherwise have nothing in common. It works by assigning standard numeric values to letters, numbers, punctuation marks and other characters such as control codes. An uppercase “A,” for example, is represented by the number 65.
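(Archivist’s note, not part of the original article: the mapping is easy to see for yourself in any language with a character-to-code function. A minimal Python sketch:)

```python
# ASCII in miniature: characters and small integers, interchangeable.
for ch in ["A", "a", "0", "?"]:
    print(ch, "->", ord(ch))   # ord() gives the numeric code: A -> 65, a -> 97, ...

print(chr(65))                 # ...and chr() maps the number 65 back to "A"
```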

All the characters used in e-mail messages are ASCII characters, as are the characters in HTML documents.

But in 1960, there was no such standardization. IBM’s equipment alone used nine different character sets. “They were starting to talk about families of computers, which need to communicate. I said, ‘Hey, you can’t even talk to each other, let alone the outside world,’” says Bemer, who worked at IBM from 1956 to 1962.

Midway through Bemer’s IBM career, this heterogeneity became a real concern. So in May 1961, Bemer submitted a proposal for a common computer code to the American National Standards Institute (ANSI). The X3.4 Committee — representing most computer manufacturers of the day and chaired by John Auwaerter, vice president of the former Teletype Corp. — was established and got right to work.

It took the ANSI committee more than two years to agree on a common code. Part of the lengthy debate was caused by self-interest: the committee had to decide whose proprietary characters would be represented. “It got down to nitpicking,” Bemer says. “But finally, Auwaerter and I shook hands outside of the meeting room and said, ‘This is it.’” Ironically, the end result bore a strong resemblance to Bemer’s original plan.

If you were to jump ahead to the present, you’d think it was smooth sailing after that. Today, ASCII is used in billions of dollars’ worth of computer equipment as well as most operating systems — the exception being Windows NT, which uses the newer Unicode standard, itself only somewhat compatible with ASCII.

However, there was an 18-year gap between the completion of ASCII in 1963 and its common acceptance, and that gap had everything to do with IBM and its System/360, which was released in 1964. While ASCII was being developed, everyone — even IBM — assumed the company would move to the new standard. Until then, IBM used EBCDIC, an extension of the old punch-card code.

But just as ASCII became a done deal and the System/360 was ready for release, Dr. Frederick Brooks, head of IBM’s OS/360 development team, told Bemer the punch cards and printers wouldn’t be ready for ASCII on time. IBM tried to develop a way for the System/360 to switch between ASCII and EBCDIC, but the technique didn’t work.

Until 1981, when IBM finally used ASCII in its first PC, the only ASCII computer was the Univac 1050, released in 1964 (although Teletype immediately made all of its new typewriter-like machines work in ASCII). But from that point on, ASCII became the standard for computer communication.

The story of ASCII wouldn’t be complete without mentioning the “escape” sequence. According to Bemer, it’s the most important piece of the ASCII puzzle. Early in the game, ANSI recognized that 128 characters were insufficient to accommodate a worldwide communication system, but the seven-bit limitation of the hardware of the day prevented the code from growing beyond that.

So Bemer developed the escape sequence, which allows the computer to break from one alphabet and enter another. Since 1963, more than 150 “extra-ASCII” alphabets have been defined.
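(Archivist’s note, not part of the original article: that escape mechanism is still doing its job. The ISO-2022-JP encoding, for instance, uses ASCII’s ESC character, code 27, to shift between Latin and Japanese character sets mid-stream. A quick Python illustration:)

```python
# ISO-2022-JP switches alphabets with escape sequences built on ASCII's ESC (0x1B).
encoded = "Tokyo 東京".encode("iso2022_jp")
print(encoded)   # b'Tokyo \x1b$BEl5~\x1b(B'
# \x1b$B escapes into the JIS X 0208 (Japanese) character set;
# \x1b(B escapes back into plain ASCII.
```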

Along with Cobol, ASCII is one of the few basic computer technologies from the 1960s that still thrives today.

Original Source (link is broken): AgentIT

The Demise of the $.01 Sign

Originally posted by Charlie Anderson. The site seems to be hanging on but has not been updated since 2003. I liked finding a post about a single keyboard character (there is more history to it than that, of course). We don’t think about individual keyboard characters very often, but they have been around for centuries, long before they were placed on a keyboard.

When I was a boy, not so long ago, there was a thing called the cent sign. It looked like this: ¢

It was the dollar sign’s little brother, and it lived on comic book covers, in newspaper advertisements, on pay phones, and wherever anything was being sold for less than a buck. It was a popular punctuation symbol—no question mark or dollar sign, certainly, but just behind the * in popularity, and I daresay well ahead of #, &, and the now Internet-hot @. It owned an unshifted spot on the typewriter keyboard, just to the right of the semicolon, and was part of every third grader’s working knowledge.

In the late 1990s, you don’t see many cent signs. Why? Because hardly anything costs less than a dollar anymore? Actually, the demise of the cent sign has little to do with inflation, and everything to do with computers. And therein lies a tale.

In the 1960s a disparate group of American computer manufacturers (basically, everyone but IBM) got together and agreed on an encoding standard that became known as ASCII (“ass-key”—the American Standard Code for Information Interchange). This standard simply assigned a number to each of the various symbols used in written communication (e.g., A-Z, a-z, 0-9, period, comma). A standard made it possible for a Fortran program written for a Univac machine to make sense to a programmer (and a Fortran compiler) on a Control Data computer. And for a Teletype terminal to work with a Digital computer, and so on.

So-called text files, still in widespread use today, consist of sequences of these numbers (or codes) representing letters, spaces, and line endings. Text editors (for example, the Windows Notepad application) display ASCII codes as lines of text on your screen so that you can read and edit them. Similarly, an ASCII keyboard spits out the value 65 when you type a capital ‘A,’ 65 being the ASCII code for ‘A.’
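(Archivist’s note, my example rather than Anderson’s: you can watch this happen by writing a short text file and reading back the raw bytes.)

```python
# A "text file" is nothing but a sequence of ASCII codes on disk.
with open("hello.txt", "wb") as f:
    f.write(b"Hi\n")             # three bytes: 'H', 'i', and an end-of-line

with open("hello.txt", "rb") as f:
    print(list(f.read()))        # [72, 105, 10]
```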

The committee decided on a seven-bit code; this allowed for twice as many characters as the existing six-bit standards and permitted a parity bit on eight-bit tape. So there were 128 slots to dole out, and given the various non-typographic computing agendas to attend to, it was inevitable that some common symbols, including several that had always been on typewriter keyboards, wouldn’t make the cut. (The typewriter layout had certain obvious failings in computer applications, for example letting the lowercase L double as the digit 1, so it couldn’t be blindly adopted.)
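(Archivist’s note, a sketch of mine rather than the committee’s: seven bits give 2^7 = 128 codes, double the 64 of a six-bit code, and the eighth bit of a frame is left free to carry parity.)

```python
# Pack a 7-bit ASCII code into 8 bits, using the spare eighth bit for even parity.
def with_even_parity(code7: int) -> int:
    assert 0 <= code7 < 128               # 2**7 = 128 slots, twice 2**6 = 64
    parity = bin(code7).count("1") % 2    # 1 if the code has an odd number of 1-bits
    return code7 | (parity << 7)          # set bit 7 so the total count comes out even

print(f"{with_even_parity(ord('A')):08b}")   # 01000001: 'A' already has an even bit count
print(f"{with_even_parity(ord('C')):08b}")   # 11000011: parity bit evens out three 1-bits
```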

Three handy fractions were cut: ¼ ½ ¾. This makes sense, especially when you consider that the ASCII committee was composed of engineers. I’m sure they thought, in their engineer’s way, “Why have ¼ but not 1/3? And if we have 1/3, then why not 1/5? Or 3/32?” Similarly, the committee apparently found $0.19 an acceptable, if somewhat obtuse, way of expressing the price of a Bic pen. At any rate, the popular and useful cent sign didn’t make it.

And so the cent sign was off keyboards, terminals, and printers. Not that many people noticed right away. The companies behind ASCII sold big, expensive computers that were used to run businesses, and few cared that there wasn’t a cent sign character on one’s new line printer. Heck, if your printer could handle lower-case letters, you were state of the art.

But when personal computers began to appear in the late 1970s, the primary application driving their adoption was word processing. These new small computers used the ASCII standard—after all, that’s what standards are for. By the millions, typewriter keyboards (with ¢) were traded in for Apple IIs and IBM PCs (without ¢). While it’s true that the cent sign was ultimately made part of other, larger encoding standards, and that it is possible to produce on modern PCs with a little effort, the damage had been done. Without a cent key in front of them, writers of books, newspapers, magazines, and advertisements made do without. And over time, $0.19 began to look like the right way to say 19¢. In another few years the cent sign will look as alien as those strange S’s our forefathers used when they wrote the Constitution.
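(Archivist’s note, not Anderson’s text: the “little effort” today usually means reaching for Unicode, where the cent sign survives as code point U+00A2, typed as Alt+0162 on a Windows keyboard, or, in Python:)

```python
cent = "\u00a2"              # U+00A2 CENT SIGN, the character 7-bit ASCII left out
print(f"19{cent}")           # 19¢
print(cent.encode("utf-8"))  # b'\xc2\xa2': two bytes, since it lies beyond ASCII's 128 slots
```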

Original Source (site is untouched since 2003): The Demise of the $.01 Sign