Friday, December 18, 2009

Computers in History





Strictly speaking, a computer is something that computes, which is not a particularly informative definition. In the vagueness of the term, however, you'll find an interesting bit of history. The word computer does not necessarily mean an electronic machine or a machine at all. If you were a researcher a hundred years ago and you wanted to take a break from heavy-duty math work, such as creating a tide table, you might have taken your computer out to lunch. Scientists engaged in difficult mathematics often employed a bevy of computers: men and women with pencils, papers, and green eye-shades who computed the numbers they needed.


Up until the end of World War II, a computer was a person who computed. She might use a pencil (a pen if she were particularly confident of her results), a slide rule, or even a mechanical calculator. Poke a few numbers in, pull a crank, and the calculator machine printed an answer in purple ink on paper tape, at least if the questions involved simple arithmetic, such as addition. If this person did a lot of calculations, the ink of the numbers soon faded to a pale gray, and she grew calluses on her fingertips and cranking hand.


The early machines for mathematics were once all known as calculators, no matter how elaborate, and they could be quite elaborate. Charles Babbage, a 19th-century English country gentleman with a bold idea and too much time on his hands, conceived the idea of a machine that would replace the human computers used to calculate values in navigational tables. Babbage foresaw his mechanical computer-replacement as having three advantages over number-crunchers who wielded pencil and paper: The machine would eliminate mistakes, it would be faster, and it would be cheaper. He was right about all but the last, and for that reason he never saw the most intricate machines he designed actually built. Moreover, he never called his unbuilt machines "computers." His names for them were the Difference Engine and the Analytical Engine. Even though Babbage's machines are considered the forerunners of today's computers (sometimes even considered the first computers by people who believe they know such things), they really weren't known as "computers" in Babbage's time. The word was still reserved for the humans who actually did the work.


The word computer was first applied to machines after electricity replaced blood as the working medium inside them. In the early part of the 20th century, researchers struggled with the same sort of problems as those in Babbage's time, and they solved them the same way. In the 10 years from 1937 to 1947, scientists created the first devices that are classed as true computers, starting with an electrically powered mechanical machine and ending with an all-electronic device powered by an immense number of vacuum tubes, which required an equally immense amount of good fortune for them all to keep working long enough to carry out a calculation. Nobody called them computers just yet, however.


The first of these machines, a mechanical computer of which Babbage would have been proud, was the IBM-financed Automatic Sequence Controlled Calculator, often called the Harvard Mark I. The five-ton machine comprised 750,000 parts, including switches, relays, rotating shafts, and clutches. It stretched out for 50 feet and was eight feet tall. It sounded, according to an observer of the time, like a roomful of ladies knitting.


Many of the fundamentals of today's computers first took form in the partly electronic, partly mechanical machine devised by John Vincent Atanasoff at Iowa State College (now University). His ideas and a prototype built with the aid of graduate student Clifford Berry have become a legend known as the Atanasoff-Berry Computer (with the acronym ABC), the first electronic digital computer, although it was never contemporaneously called a "computer." Iowa State called the device "the world's fastest calculator" as late as 1942.


In Britain, crypto-analysts developed a vacuum-tube (valve in Britain) device they called Colossus that some people now call the first electronic computer, usually British folk who don't want you to forget that the English can be clever, too. But the rest of the world never called Colossus a computer (or anything else) because it was top secret until the end of the century.


The present usage of the word computer goes back only to June 5, 1943, when ENIAC (the most complex vacuum-tube-based device ever made) was first proposed as a collaboration between the United States Army and the University of Pennsylvania. The original agreement on that date first used the description that became its name, as well as the name for all subsequent machines: the Electronic Numerical Integrator and Computer.


Three years and $486,804.22 later, the machine made its first computation at the university. The 30-ton behemoth, and its offspring, captured the imagination of the world, and the term computer shifted from flesh-and-blood human calculators to machines. In Hollywood, such thinking machines grew even bigger and took over the world, at least in 1950s science fiction movies. In business, ENIAC's offspring, the Univac, took over billing for utilities and gave a new name to bookkeeping foul-ups and bureaucratic incompetence: computer error. Also, scientists tried to figure out how to squeeze a room-sized computer into a space capsule, into which they could barely shoehorn a space-suited human being.


The scientists pretty much figured things out (they created the microprocessor, which led to the age of microcircuits) but not until after a few scientific diversions, including sending men to the moon. Oddly enough, although modern microelectronic circuitry is credited as an offshoot of the space program (shrinking things down and making them lighter was important to an industry in which lifting each ounce cost thousands of dollars), in the history of technology the moon landing (1969) comes two years before the invention of the microprocessor (1971).


Once the microprocessor hit, however, tinkerers figured out how to make small computers cheap enough that everyone could afford one. Computers became personal.





