Monday, February 17, 2014

The Story of the Computer



Since ancient times, people have had ways to deal with data and numbers. Computing hardware has evolved from simple devices that aid calculation, to mechanical calculators, to punched-card data processing, and then to modern stored-program computers. Devices have been used to aid computation for thousands of years, mostly using one-to-one correspondence with fingers. The earliest counting device was probably a form of tally stick. The abacus was used early on for arithmetic tasks. What we now call the Roman abacus was used in Babylonia as early as 2400 BC. Since then, many other forms of reckoning boards or tables have been invented.

As trade and tax systems grew in complexity, people saw that faster, more reliable, and more exact tools were needed for doing math and keeping records.

Scottish mathematician and physicist John Napier discovered that the multiplication and division of numbers could be performed by the addition and subtraction, respectively, of the logarithms of those numbers. While producing the first logarithmic tables, Napier needed to perform many tedious multiplications. It was at this point that he designed his 'Napier's bones', an abacus-like device that greatly simplified calculations that involved multiplication and division.
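
The principle behind logarithm tables can be sketched in a few lines of Python (a modern illustration of the idea, not Napier's actual tables or procedure): multiplying two numbers is reduced to looking up their logarithms, adding, and taking the antilogarithm of the sum.

    import math

    # Illustration of the principle behind logarithm tables (not Napier's
    # actual method): to multiply two numbers, add their logarithms and
    # take the antilogarithm of the sum.
    a, b = 37.0, 52.0

    log_sum = math.log10(a) + math.log10(b)   # "look up" each log and add
    product = 10 ** log_sum                   # take the antilog of the sum

    print(product)   # ~1924.0
    print(a * b)     # 1924.0, the same result by direct multiplication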

Mechanical calculators - In the mid-1600s, Blaise Pascal and his father, a tax officer, were working on taxes for the French government in Paris. The two spent hours figuring and re-figuring the taxes that each citizen owed. In 1642, young Blaise decided to build an adding and subtracting machine that could aid in such a tedious and time-consuming process. Pascal did some pioneering work on calculating machines and, after three years of effort and 50 prototypes, he had invented the mechanical calculator. He built twenty of these machines (called Pascal's Calculator or the Pascaline) over the following ten years. Pascal's machine had a set of eight gears that worked together much like an odometer keeps track of a car's mileage. The machine encountered many problems. For one, it was always breaking down. Second, it was slow and extremely costly. And third, people were afraid to use it, thinking it might replace their jobs. Pascal later became famous for mathematics and philosophy, but he is still remembered for his role in computer technology. In his honor, there is a computer language named Pascal.
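
The odometer-style carry that Pascal's gears performed can be imitated in software. The following Python toy (a simplification for illustration, not a model of Pascal's actual gearwork) treats each decimal digit as a wheel that resets to 0 and nudges its neighbor when it rolls past 9.

    # Toy sketch of odometer-style carrying: each "wheel" holds one decimal
    # digit; rolling a wheel past 9 resets it and advances the next wheel.
    def add_on_wheels(wheels, amount):
        """Add `amount` by turning the rightmost wheel one step at a time."""
        for _ in range(amount):
            i = len(wheels) - 1
            while i >= 0:
                wheels[i] += 1
                if wheels[i] < 10:      # no carry needed
                    break
                wheels[i] = 0           # wheel passes 9: reset and carry left
                i -= 1
        return wheels

    print(add_on_wheels([0, 0, 9, 9], 3))   # [0, 1, 0, 2], i.e. 99 + 3 = 102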

Around 1820, Charles Xavier Thomas de Colmar created the first successful, mass-produced mechanical calculator, the Thomas Arithmometer, that could add, subtract, multiply, and divide. It was mainly based on Leibniz's work. Mechanical calculators remained in use until the 1970s.

In 1801, Joseph-Marie Jacquard developed a loom in which the pattern being woven was controlled by punched cards.

The next big step for computers arrived in the 1830s, when Charles Babbage decided to build a machine to help him compute and print mathematical tables. Babbage was a mathematician who taught at Cambridge University in England. He began planning his calculating machine, calling it the Analytical Engine. The idea for this machine was amazingly like the computer we know today. It was to read a program from punched cards, compute and store the answers to different problems, and print the answers on paper. Babbage died before he could complete the machine. However, because of his remarkable ideas and work, Babbage is known as the Father of Computers.

The next huge step for computers came when Herman Hollerith entered a contest held by the U.S. Census Bureau. The contest was to see who could build a machine that would count and record information faster. Hollerith, a young man working for the Bureau, built a machine called the Tabulating Machine that read and sorted data from punched cards. In the late 1880s, Hollerith had worked out how to store data on punched cards that could then be read by a machine. The holes punched in each card matched a person's answers to questions; for example, married, single, and divorced were answers on the cards. The Tabulator read the punched cards as they passed over tiny brushes. Each time a brush found a hole, it completed an electrical circuit, which caused a special counting dial to advance the count for that answer.
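
The counting scheme is simple enough to mimic in a few lines. In the Python sketch below (a simplification with made-up answer categories, not Hollerith's actual card layout), each card is just the set of positions where a hole was punched, and the program advances one counter per hole, much as a closed circuit advanced a counting dial.

    from collections import Counter

    # Each "card" is the set of punched positions; the category names here
    # are hypothetical, chosen only for illustration.
    cards = [
        {"married", "male"},
        {"single", "female"},
        {"married", "female"},
    ]

    counters = Counter()
    for card in cards:
        for hole in card:        # a brush finds a hole -> the circuit closes
            counters[hole] += 1  # -> the matching counting dial advances

    print(counters["married"])   # 2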

Thanks to Hollerith's machine, instead of taking seven and a half years to count the census information, it took only three years, even with 13 million more people than at the last census. Happy with his success, Hollerith formed the Tabulating Machine Company in 1896. The company merged with others in 1911 to form the Computing-Tabulating-Recording Company, which in 1924 was renamed the International Business Machines Corporation, better known today as IBM.

The era of modern computing began with a flurry of development before and during World War II. Most digital computers built in this period were electromechanical - electric switches drove mechanical relays to perform the calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes.

In 1936 British mathematician Alan Turing proposed the idea of a machine that could process equations without human direction. The machine (now known as a Turing machine) resembled an automatic typewriter that used symbols for math and logic instead of letters. Turing's machine was the theoretical precursor to the modern digital computer.
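
A Turing machine is easy to simulate. The Python sketch below (an illustration of the idea, not Turing's own notation) uses a tape of symbols, a read/write head, and a rule table mapping (state, symbol read) to (symbol to write, head movement, next state); this hypothetical machine simply flips every bit on its tape and halts at the first blank.

    # Rule table: (state, symbol read) -> (symbol to write, move, next state)
    rules = {
        ("scan", "0"): ("1", +1, "scan"),
        ("scan", "1"): ("0", +1, "scan"),
        ("scan", " "): (" ",  0, "halt"),
    }

    def run(tape, state="scan", head=0):
        tape = list(tape) + [" "]          # a blank marks the end of the tape
        while state != "halt":
            write, move, state = rules[(state, tape[head])]
            tape[head] = write
            head += move
        return "".join(tape).strip()

    print(run("10110"))   # -> "01001"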

The first all-electronic general-purpose computer was the ENIAC (Electronic Numerical Integrator and Computer), a digital computer built in 1946 by J. Presper Eckert and John Mauchly. The ENIAC contained over 18,000 vacuum tubes. In twenty seconds, ENIAC could do a math problem that would have taken one person 40 hours to finish. The ENIAC was built during World War II, and its first job was to calculate the feasibility of a design for the hydrogen bomb. The machine was 100 feet long and 10 feet tall. Many of ENIAC's early tasks were for military purposes, such as calculating ballistic firing tables and designing atomic weapons. Since ENIAC was initially not a stored-program machine, it had to be reprogrammed, by rewiring, for each task.

A more modern type of computer began with John von Neumann's development of software written in binary code. It was von Neumann who began the practice of storing data and instructions in binary code and initiated the use of memory to store data as well as programs. A computer called the EDVAC (Electronic Discrete Variable Automatic Computer) was built along these lines around 1950. Before the EDVAC, computers like the ENIAC could do only one task; then they had to be rewired to perform a different task or program. The EDVAC's concept of storing programs in memory, instead of rewiring the computer, led to the computers that we know today.
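
The stored-program idea is easiest to see in miniature. In the Python toy below (a deliberately tiny machine, not the EDVAC's actual instruction set), instructions and data live in the same memory list, and changing the program means changing memory contents rather than rewiring anything.

    # A toy stored-program machine: memory holds the instructions themselves,
    # so a new program is just new memory contents, not new wiring.
    memory = [
        ("LOAD", 5),      # put 5 in the accumulator
        ("ADD", 7),       # add 7 to it
        ("PRINT", None),  # print the accumulator
        ("HALT", None),   # stop
    ]

    def run(memory):
        acc, pc = 0, 0                    # accumulator and program counter
        while True:
            op, arg = memory[pc]
            pc += 1
            if op == "LOAD":
                acc = arg
            elif op == "ADD":
                acc += arg
            elif op == "PRINT":
                print(acc)                # prints 12
            elif op == "HALT":
                break

    run(memory)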

Transistor computers - The bipolar transistor was invented in 1947. From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power, so they give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had a longer, effectively indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space.

The next great advance in computing power came with the advent of the integrated circuit (the chip). The first practical ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor. Kilby described his new device as “a body of semiconductor material wherein all the components of the electronic circuit are completely integrated.”

Noyce came up with his own idea of an integrated circuit half a year after Kilby. His chip solved many practical problems that Kilby's had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium.

This new development heralded an explosion in the commercial and personal use of computers and led to the invention of the microprocessor. While the subject of exactly which device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel 4004, designed and realized by Ted Hoff, Federico Faggin, and Stanley Mazor at Intel.

The progression in hardware representation of a bit of data:
  • Vacuum tubes (1950s) - one bit per tube, each about the size of a thumb
  • Transistors (1950s and 1960s) - one bit per transistor, each about the size of a fingernail
  • Integrated circuits (1960s and 70s) - thousands of bits on a chip the size of a hand
  • Silicon computer chips (1970s and on) - millions of bits on a chip the size of a fingernail
Note - The first successful high-level programming language was FORTRAN, developed by John Backus and IBM (1954).