In the last few years, computers have emerged as one of the most useful tools of our time. Because of their ability to process information and perform complex mathematical operations quickly, computers have appeared nearly everywhere in our society, from business to science. People have found so many uses for this tool that it now shapes our lifestyle and will only grow more important in the future.
Computers were not always so powerful. Like all things, they started small. However, it is hard to imagine that these complex machines had such simple beginnings as the abacus and slide rule. This paper will trace the evolution of the modern computer from its humble roots to today.
Since the dawn of humanity, people have been looking for ways to make their work easier. When numbers were invented to make record keeping simpler, people immediately looked for easier ways to manage them. The first calculating device, the abacus, was invented for that purpose. It consisted of a set of beads strung into rows on a wooden frame. Later, the Arabic number system made paper calculations easier. Still, the job was tedious and left plenty of room for error. Another way of managing numbers had to be found.
In 1642, a 19-year-old French mathematician named Blaise Pascal invented what is considered to be the world's first automated calculating device. It was called the Pascaline. He invented the machine to help his father in his job as a tax collector. Irving E. Fang, in his book The Computer Story, gives a description of how Pascal's device worked. "Numbers were entered by dialing them in. A wheel moved through ten marked positions, 0-9, before nudging the wheel next to it from 0 to 1 and, at the same time, returning to 0. The position of the wheels determined the sum in addition and the difference in subtraction..."(8). The same principle is still used today in devices such as odometers.
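To illustrate the carry principle Fang describes (the same one still used in odometers), here is a minimal sketch in Python; the function name and the four-wheel example are only illustrative and are not a model of Pascal's actual gearing.

    # Odometer-style carry: each "wheel" holds a digit 0-9 and, on rolling
    # past 9, returns to 0 while nudging the next wheel up by one.
    def add_on_wheels(wheels, position, amount):
        """Add `amount` to the wheel at `position`; wheels[0] is the ones place."""
        while position < len(wheels):
            total = wheels[position] + amount
            wheels[position] = total % 10   # wheel returns to 0 after passing 9
            amount = total // 10            # the carry nudges the next wheel
            if amount == 0:
                break
            position += 1
        return wheels

    # Example: a four-wheel machine showing 0198, then 7 is dialed into the ones wheel.
    wheels = [8, 9, 1, 0]                   # ones, tens, hundreds, thousands
    print(add_on_wheels(wheels, 0, 7))      # -> [5, 0, 2, 0], i.e. 0205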
Pascal thought his Pascaline would be a commercial success. However, the machine had problems that prevented him from selling more than a few. The first was that it was not easy to use. Even "...the simple operation of subtraction required interpreting results on the machine instead of actually reading results..."(Blissmer, p. 353). It was also expensive, and many people simply resisted the change. Still, Pascal's idea was not abandoned forever.
A German mathematician named Gottfried von Leibniz attempted to improve Pascal's machine, in the process creating his own calculating device, the Stepped Reckoner, in 1673. His device contained cylinders--later called "Leibniz wheels"--with stepped teeth that ran different lengths around them. It could add, subtract, multiply, and divide, an improvement on the Pascaline, which could only add and subtract. Machines using Leibniz's principle were used even into the twentieth century.
There were also ways to make paper calculations easier. John Napier, a Scottish baron, developed logarithms, "...numbers that enable multiplication and division to be reduced to addition and subtraction"(Blissmer, p.352). He also created "Napier's bones", sticks of bone or ivory with numbers etched into them which could be used as an "aid to multiplication and division"(Litterick, p.15). "Napier's bones" were the precursor to the slide rule, a set of ruled and numbered sticks that were used by mathematicians, scientists and engineers until the 1970's.
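The principle Blissmer describes can be shown with a short worked example; the Python below is only an illustration of the idea (the log of a product equals the sum of the logs), not a reconstruction of Napier's tables.

    # Logarithms reduce multiplication to addition: to multiply a and b,
    # look up their logs, add them, then take the antilog of the sum.
    import math

    a, b = 37.0, 145.0
    log_sum = math.log10(a) + math.log10(b)   # addition replaces multiplication
    product = 10 ** log_sum                   # the "antilog" recovers the product
    print(round(product, 4), a * b)           # both give 5365.0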
Logarithms and slide rules had one weakness, though. They were only as accurate as the people using them, and sometimes there was no margin for error. Fang states, "...ships ran into rocks because numbers printed in navigation tables were figured badly or printed in error..."(12). That problem led one person to create a device that would become the precursor of modern computers.
Around the year 1830, Charles Babbage, an English mathematician, designed a machine that would accurately calculate tables. It was called a "Difference Engine". However, he was not able to get the machine to work. He then conceived of a more advanced machine, which he called the "Analytical Engine". According to his design, it would be as large as a house, a brass and pewter "mechanical monster of rods, gears, and wheels," powered by six steam engines, that could solve any arithmetic problem(Fang, p.15). According to The Story of Computers by Ian Litterick, "...it would be flexible and it would be programmable". It had "all the essential parts of a modern computer--input; a central processing unit, which he called 'the mill'; storage for intermediate results, which we would call 'memory'; a control unit to make the machine perform its calculations in the right order; and output in the form of paper..."(18). That is why he is known as the "Father of the Computer". Even though he could not build the machine, he came up with the characteristics that would identify computers in the modern age.
Babbage was aided in his endeavors by a friend and colleague, Augusta Ada Lovelace. She has been called the first computer programmer. She "helped develop the instructions for doing computations on the analytical engine"(Capron, p.549). She also "published a series of notes that eventually led others to accomplish what Babbage...had been unable to do."
The next big leap in the evolution of computers occurred in the United States. Herman Hollerith, a young American statistician, developed a machine to speed the counting of the 1890 Census. The machine was partly mechanical and partly electrical. He borrowed an idea that had worked in the weaving industry: punched cards were used to store and input data, which could be read quickly using the "new" power of electricity. With Hollerith's system, the Census Bureau produced a national total in only six weeks.
Hollerith was an instant success. "Orders poured into Hollerith's company from census bureaus around the world and from large private businesses which wanted machines for accounting and inventory...Hollerith's company merged with others...eventually [becoming] International Business Machines, IBM"(Fang, p.24).
During World War II, computer technology advanced further. An electromechanical machine named the Mark I was among the most famous of the computers built during that time. It was built by IBM for the U.S. Navy and used electric relays to transmit data and instructions. A special-purpose computer named Colossus was used by the British to break German radio codes. Also, at Iowa State University, the ABC computer, developed by John V. Atanasoff and Clifford Berry, was built as the first all-electronic digital computer.
The creation of the ENIAC in 1946 was the next big step in computer technology. ENIAC, short for Electronic Numerical Integrator and Computer, was the first completely electronic general-purpose computing machine. It could calculate complex mathematical problems a thousand times faster than any previous device. It also covered an area of 15,000 square feet, weighed 30 tons, and used approximately 19,000 vacuum tubes. These tubes stored and conveyed data much as punched cards and electric relays had done in earlier machines.
The ENIAC was not without its share of problems. It was hard to reprogram. "For any new program ENIAC had to have hundreds of cables replugged and hundreds of switches reset before its...calculations. Sometimes scientists took days to prepare ENIAC for a computation it completed in seconds"(Fang, p.43). Also, the vacuum tubes inside ENIAC required lots of energy to run and broke down constantly. The scientists would have to replace each one as it burned out.
The scientists who built ENIAC recognized these problems even before ENIAC was completed, and they set about building a new computer to correct them. The ENIAC team, aided by John von Neumann, a Hungarian-American mathematician, designed their Electronic Discrete Variable Automatic Computer (EDVAC) around the stored-program concept, which let instructions be held in the machine's memory instead of being wired in by hand. This eased the programming burden and solved the first problem.
In 1947, two scientists who worked on the ENIAC, J. Presper Eckert and John W. Mauchly, formed their own company, the Eckert-Mauchly Computer Corporation. They built the first computer to be sold commercially, the UNIVersal Automatic Computer, or UNIVAC for short. It used the stored-program concept from the EDVAC design and could be used for "...alphabetic as well as numeric uses"(TUTOR.COM). UNIVAC was a commercial success. In Computers: Mechanical Minds by Don Nardo, it states, "Large business firms, government agencies, and universities bought models of the UNIVAC. Each group used the computer to cut down the time and costs of processing huge amounts of information"(20). A UNIVAC working for CBS even helped predict the results of the 1952 presidential election. The UNIVACs performed their intended tasks well until replaced by more advanced devices. Still, the UNIVAC had essentially the same problem with its vacuum tubes as did ENIAC. The excessive costs in money, time, effort, and energy "gave scientists a strong incentive to find a more reliable replacement"(Nardo, p.22). A replacement was found in 1948.
Working at Bell Labs in New Jersey, three scientists, William Shockley, Walter Brattain, and John Bardeen, experimented with semiconductor crystals to create what they called a "transfer resistor", or transistor for short. It was "a little switch [that could] control the passage of electricity"(Litterick, p.30). The transistor worked just like a vacuum tube, only it was more reliable, took up a thirtieth of the space, used a twentieth of the electricity, generated a fiftieth of the heat, and cost less to make. Computer makers and computer buyers understandably switched to the new technology.
No single outstanding computer was made during this period, but the era was remarkable for two reasons. One, "...it was famous for development of higher order [computer] languages. Computers could now be programmed with English-like commands instead of strings of numbers. Programming efficiency improved greatly"(TUTOR.COM).
The second reason was that it paved the way for another "computer revolution". In 1958, Jack S. Kilby, working for Texas Instruments, conceived the idea of manufacturing several transistors together in a small package and in doing so created the Integrated Circuit (IC), also called a chip. Once manufacturers had chip-making technology, chips could be made at very low cost. Chips were smaller than separate transistors, so they could be put in more and more places. They were used to control and navigate American spacecraft. Chip technology was also responsible for getting computers into "...medium-size and smaller businesses and government operations where they had not been used before"(Capron, p.554). To do this, computer programs and systems had to become more sophisticated to fit the needs of users. A new class of computer, the minicomputer, was unveiled, smaller and cheaper than the mainframe computers that came before it. The concept of time sharing was developed so that many users could use a computer at one time. "Instead of...processing decks of cards into a punched card reader hooked to a mainframe computer, users were given terminals...[which] consisted of a cathode ray tube, like a television screen, and a keyboard. A large computer could easily talk to a dozen or more terminals. Its split-second reaction could convince each user that he or she alone was communicating with the computer"(Fang, pp. 53-54). Computer users benefited from a computer's ability to be, in effect, several places at once.
The next generation of computers was even smaller, more powerful, and used by more people. It began in 1971, when Ted Hoff of Intel Corporation "...decided that it might be possible to place all the arithmetic and logic circuitry on one chip. When combined with the 4001 ROM [Read Only Memory] chip and the 4002 RAM [Random Access Memory] chip, it made a microcomputing system"(Blissmer, p.372). This began the age of the microprocessor. Its power quickly became apparent: "The...4004 [microprocessor chip] had 2250 transistors on a 1/6-inch long by 1/8-inch wide chip" and "almost matched ENIAC's 1946 total computational power."
In this age, more and more uses were found for computers. Hand-held pocket calculators and digital watches were widely accepted. However, this generation of computers was brought even farther, into the average home. The first really successful personal computer hit the market in 1975. It was developed by a company called MITS (Micro Instrumentation and Telemetry Systems) and conceived by its president, Ed Roberts. Called the MITS Altair, it was not like today's personal computers. "In fact, it met the definition of a computer in only a minimal way: it had a central processing unit (on the chip), 256 characters (a paragraph!) of memory, and switches and lights on a front panel for input/output. No screen, no keyboard, no storage"(Capron, p.547). However, the machine was an instant success. "...2,000 orders for the computer came flooding in when MITS had only one prototype...they managed to make more, and had sold 8,000 computers by the end of 1975"(Litterick, p.37).
Another development in the history of computers occurred when two young hobbyists, Steve Jobs and Steve Wozniak, built the first Apple computer in their garage in 1976. They were able to "...put together a complete computer: a keyboard for input, a few thousand characters of memory, and the means to connect it up to a screen to see the results"(Litterick, p.38). Their Apple I was more advanced and easier to use than the Altair, and they sold 200 of them easily. In 1977 they designed and built the Apple II "at a price which brought it within the range of people who could afford a fairly expensive hi-fi set...it appealed to a whole new range of people who did not know about electronics and computers"(38).
This opened a whole new market for computer manufacturers to exploit. Schools, small businesses, and average homes were starting to use the new "microcomputers". Some people who had worked on the original mainframes and minicomputers criticized the new "micros", calling them toys, but in 1981 IBM, the computer giant of several decades, introduced its IBM Personal Computer (PC). The PC, as powerful as a minicomputer of ten years earlier, took the business world by storm. This big splash in the pond proved microcomputers were here to stay.
We have gone through many eras of computer history, each one rapidly revolutionizing the entire computer industry. Computers have shrunk dramatically and increased immensely in power with each era. Nowadays, a single microchip the size of a fingertip has far more power than the thirty-ton ENIAC. Because of their small size and many uses, computers have appeared in the home, not only in PCs, but in video games, TV sets, VCRs, radios, microwaves, telephones, and almost every other home appliance. In business, computers have made things much easier. Computer-controlled robots now handle many industrial jobs. Computers are also in many more places, with more added each day. It would seem that we have developed the ultimate tool for mankind. That is, until something better comes along.
Berger, Melvin. Computers in Your Life. New York: Thomas Y. Crowell, 1981.
Blissmer, Robert H. Introducing Computers: Concepts, Systems, and Applications. 1992-93 ed. New York: John Wiley and Sons, 1992.
Capron, H. L., and John D. Perron. Computers and Information Systems: Tools for an Information Age. 3rd ed. Redwood City: Benjamin/Cummings Publishing Company, 1993.
Computer Knowledge. TUTOR.COM: Tutorials About PC Computing. Vers. 4.2. Computer software. Computer Knowledge, 1985.
Elliot, Sharon. Living With the Microchip. New York: Bookwright Press, 1985.
Fang, Irving E. The Computer Story. St. Paul: Rada Press, 1988.
Litterick, Ian. The Story of Computers. New York: Bookwright Press, 1984.
Nardo, Don. Computers: Mechanical Minds. San Diego: Lucent Books, 1990.
Rand McNally and Company. Encyclopedia of Computers and Electronics. Chicago: Rand McNally and Company, 1985.
Copyright 1998 by Jack Mileur. All rights reserved.