Wednesday 10 February 2016

The History of Computers



Early Counting Tools

A computer is a machine that works with data and information in the form of numbers. From the earliest times, people have invented, and continue to invent, things that help them count.
Cavemen counted with the only counting tools they knew: their fingers and toes. These are considered the first counting tools. Soon people realized that other objects were needed to keep up with larger numbers. Among the counting tools used through the ages are stones, knots tied in ropes (the quipu), and notches cut into sticks, bones, and tally sticks. People used these tools to count their possessions and to keep track of the passing of time.

Counting

The oldest known objects used to represent numbers are bones with notches carved into them.
These bones, which were discovered in western Europe, date from the Aurignacian period, 20,000 to 30,000 years ago, and correspond to the first appearance of Cro-Magnon man.

Of special interest is a wolf's jawbone more than 20,000 years old with fifty-five notches in groups of five. This bone, which was discovered in Czechoslovakia in 1937, is the first evidence of the tally system.
Another form of manual counting used was knotted strings, sometimes known as the quipu. In their simplest form, knotted number strings are much the same as the simple tally sticks. Counting with the use of knotted strings has been found all over the world. The Inca Indians in particular were known for using the quipu.





It is important to distinguish the early abacuses (or abaci) known as counting boards from the modern abaci. The counting board is a piece of wood, stone or metal with carved grooves or painted lines between which beads, pebbles or metal discs were moved. The abacus is a device, usually of wood (plastic, in recent times), having a frame that holds rods with freely-sliding beads mounted on them.
Both the abacus and the counting board are mechanical aids used for counting; they are not calculators in the sense we use the word today. The person operating the abacus performs calculations in their head and uses the abacus as a physical aid to keep track of the sums, the carries, and so on.
Counting boards and counting tablets were also used to represent everyday calculations such as goods bought and sold.

The oldest surviving counting board is the Salamis tablet (originally thought to be a gaming board), used by the Babylonians circa 300 B.C., discovered in 1846 on the island of Salamis.


Abacus

Approximately 4,000 years ago, the Chinese invented the abacus. It was the first machine used for counting and calculating. It is made of a wooden frame, metal rods, and wooden beads. It takes a great deal of time and practice to master the use of an abacus; an abacist is a person who is highly skilled at it. Today, the abacus is still widely used in China and other Asian countries to count and calculate, just as we use calculators.
Each bead has a specific value. Reading from right to left, the beads in the first column are worth 1, in the second column the beads are worth 10, in the third column the beads are worth 100, etc. Addition, subtraction, multiplication, and division are performed by moving the appropriate beads to the middle of the abacus.
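To make the place-value idea concrete, here is a minimal sketch in Python (purely illustrative; the column digits below are made up) showing how the beads moved in each column, read from right to left, combine into an ordinary number.

# Illustrative only: a tiny model of reading an abacus.
# Each column holds a digit 0-9; columns are listed right to left,
# so column 0 is worth 1, column 1 is worth 10, column 2 is worth 100, and so on.

def abacus_value(columns_right_to_left):
    """Return the number represented by the beads moved to the middle."""
    total = 0
    for place, digit in enumerate(columns_right_to_left):
        total += digit * (10 ** place)
    return total

# Beads showing 3 in the ones column, 0 in the tens, 7 in the hundreds: 703
print(abacus_value([3, 0, 7]))  # prints 703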


John Napier & Napier's Bones

In the early 17th century, John Napier, a Scottish mathematician, invented another calculating tool. It used marked strips of wood or bone, side by side, to multiply and divide. This tool became known as "Napier's Bones."



Gear-Driven Machines

1642-Blaise Pascal & The Pascaline

In 1642, at the age of 19, a French mathematician by the name of Blaise Pascal invented the Pascaline. The Pascaline is known as the first mechanical and automatic calculator. Pascal invented it to help make his father's job as a tax accountant easier. The machine is sometimes called La Pascaline or Pascal's machine. The Pascaline never became popular. First of all, the machine broke often and its inventor was the only person who could fix it. Second, it was slow. Third, clerks would not use it: they were afraid it might replace them at their jobs.
Pascal later became famous in math and philosophy, but he is also remembered for his role in computer history. In his honor, there is a computer language named Pascal.

 

The Pascaline

The Pascaline was a wooden box that could only add and subtract, by means of a series of gears and wheels. When each wheel completed one revolution, it turned the neighboring wheel. On top of the wheels was a series of windows through which the totals could be read. About 50 models were constructed, made of wood, ivory, ebony, and copper.
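The turning-wheel mechanism is essentially a carry: when one wheel passes 9, it nudges its neighbor. Here is a toy Python sketch of that idea (an illustration of the principle, not of Pascal's actual gearing).

# A toy model of carrying wheels: each "wheel" holds a digit 0-9,
# and passing 9 turns the neighboring wheel by one, like a carry in addition.

def add_on_wheels(wheels, amount, position=0):
    """Add `amount` to the wheel at `position` (wheels listed ones-first)."""
    while len(wheels) <= position:
        wheels.append(0)
    wheels[position] += amount
    while wheels[position] > 9:                  # the wheel completes a revolution...
        wheels[position] -= 10
        add_on_wheels(wheels, 1, position + 1)   # ...and turns its neighbor
    return wheels

wheels = [8, 9, 0]          # represents 98 (ones wheel first)
add_on_wheels(wheels, 5)    # adding 5 makes the ones wheel pass 9 and carry
print(wheels)               # prints [3, 0, 1], i.e. 103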







1673-Gottfried Wilhelm Leibniz and The Stepped Reckoner

In 1673, the German inventor Gottfried Leibniz perfected the Leibniz Calculator. Leibniz had entered university at fifteen years of age and received his bachelor's degree at seventeen. His machine is sometimes called the Stepped Reckoner. It was also a calculating machine, but much superior to the Pascaline: it could do more than just add and subtract. The Leibniz Calculator could also multiply, divide, and find square roots of numbers. It too was mechanical and worked by hand. A crank was added to speed up the work of this calculator. It was used by mathematicians and bookkeepers.

 

Leibniz's Calculator

Mr. Leibniz believed that it did not make sense for men to spend hours and hours doing mathematical calculations when he could invent a machine that would work much faster. Would you rather add a long list of numbers with a pencil and paper or use a calculator?

 

1801-Joseph-Marie Jacquard & the Jacquard Loom

Jacquard's Loom

In 1801, Jacquard invented the Jacquard loom, a weaving machine controlled by punched cards. As the loom operated, cards punched with holes were strung together in sequence; the pattern of holes determined which threads were lifted, so the right threads were automatically fed into the loom to weave a beautiful cloth.
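Here is a toy Python sketch of that idea (a made-up card layout, not Jacquard's actual format): each row of holes selects which threads are lifted on one pass of the loom.

# A toy model of punched-card control: 1 means a hole, so that thread is lifted.
card_rows = [
    [1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1],
    [1, 1, 0, 0, 1, 1],
]

for row in card_rows:
    # The loom "reads" each card and lifts only the threads with a hole.
    lifted = [i for i, hole in enumerate(row) if hole == 1]
    print("lift threads:", lifted)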

His invention scared other weavers because it made cloth faster and better than they could by hand. As a result, Jacquard's house and loom were burned down.
This violent act did not discourage Jacquard, for he built another loom. Weavers today still use the Jacquard Loom.

In the years to follow, variations on Jacquard's punched cards would find a variety of uses, including representing the music to be played by automated pianos and storing programs for computers.


Charles Babbage & his Engines

In the early 1820s, an English mathematician by the name of Charles Babbage designed a computing machine called the Difference Engine. This machine was to be used in calculating and printing simple math tables. In the 1830s, he designed a second computing machine called the Analytical Engine. This machine was to be used in calculating complicated problems by following a set of instructions.

Analytical Engine

The Analytical Engine was a design for a mechanical computer that could solve any mathematical problem. It used punched cards similar to those used by the Jacquard loom and could perform simple conditional operations.

Difference Engine

However, neither of these machines was ever finished, because the technology of the time was not advanced enough and both projects lacked funding. The computing machines built in the 1900s, and even those of today, are based on the designs of the Difference Engine and the Analytical Engine. This is why Charles Babbage is known as the "Father of Computers."

Augusta Ada Byron, Countess of Lovelace


Much of what we know about Babbage and his machine comes from the papers of Augusta Ada Byron, Countess of Lovelace and daughter of the poet Lord Byron. Lady Lovelace was a genius in math. Curious about Babbage's work, she translated an article about the Analytical Engine from French to English and added some important notes of her own about how the machine should work. She outlined the fundamentals of computer programming, including data analysis, looping, and memory addressing.
Lady Lovelace also helped Babbage with programs for the Analytical Engine. Many of her ideas are like those used in today's computer programs. Sadly, like Babbage, Lady Lovelace never lived to see her ideas used. She died at age 36, while Babbage was still working on the Analytical Engine. Her work has long outlived her, however. She is now called "the first programmer," and a programming language used chiefly by the U.S. government was named Ada in her honor.

Electro-mechanical Machines

1890-Herman Hollerith & his Tabulating Machine

An American inventor by the name of Herman Hollerith wanted to speed up the work involved in taking the government census. In 1890, nearly twenty years after Charles Babbage's death, Hollerith invented a machine called the Tabulating Machine, drawing on notes that had been left by Babbage.
Prior to this invention it took nearly eight years to count everyone in the United States and add up all the information about where people lived, their ages, and what their jobs were. The Tabulating Machine used punched cards to record and sort data. Each hole punched meant something. Where a hole had been punched, a pin would pass through it to make an electrical contact with mercury in a cup below; the completed circuit advanced a counter. Approximately 65 cards could be passed through this machine in a minute, and with it the 1890 U.S. Census was completed in only 2.5 years.
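As a rough sketch of the tabulating idea (the card categories below are invented for illustration, not Hollerith's actual card layout), this Python snippet treats each card as a set of punched positions and simply counts how often each position is punched.

from collections import Counter

# Hypothetical punched cards: one card per person, one punch per answer.
cards = [
    {"state:NY", "age:20-29", "job:farmer"},
    {"state:NY", "age:30-39", "job:clerk"},
    {"state:OH", "age:20-29", "job:farmer"},
]

tally = Counter()
for card in cards:
    for punch in card:      # a pin passes through each punched hole...
        tally[punch] += 1   # ...and advances the counter for that category

print(tally["job:farmer"])  # prints 2
print(tally["state:NY"])    # prints 2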
Hollerith did not stop with this one invention. He began a company by the name of the Tabulating Machine Company. Eventually this company changed its name to International Business Machines (IBM)--one of the largest computer companies in the world.

 

1930-Vannevar Bush and the Differential Analyzer


In 1930, Vannevar Bush introduced the first electronic "computer" in the United States. It was an analog device. That is, it could measure quantities that changed continuously, such as temperature and air pressure. It used vacuum tubes to switch electrical signals that performed calculations. Bush's machine could do 25 calculations in a few minutes. To show the results, a pen fixed above a drawing board was used to draw a curve on a graph.

The Differential Analyzer weighed 100 tons, used 2000 vacuum tubes, thousands of relays, 150 motors, and approximately 200 miles of wire.




 





1944-Howard Aiken and the Mark I


The next major invention in the history of computing began in 1937. In that year Howard Aiken outlined a plan for a machine that could perform math problems involving very large numbers. Because it handled distinct amounts or numbers, it was a digital (rather than analog) device.

In 1944, IBM paid engineers to build Aiken's machine. Called the Mark I, it was made up of 78 adding machines and desk calculators that were connected by almost 500 miles of wires. In one second, the Mark I could add three eight-digit numbers; for example, 12,345,678 plus 90,123,456 plus 78,901,234. It could print out its results on punched cards or on an electric typewriter.

The machine had some serious disadvantages, however; it was enormous--51 feet long and 8 feet high. Its 3,000 electrical switches made a terrible racket as they kicked on and off. The Mark I was expensive and complicated to build. After all, it had one million parts and weighed approximately 5 tons!

Grace Hopper


One of the primary programmers for the Mark I was a woman, Grace Hopper. Hopper found the first computer "bug": a dead moth that had gotten into the Mark II and whose wings were blocking the reading of the holes in the paper tape. The word "bug" had been used to describe a defect since at least 1889 but Hopper is credited with coining the word "debugging" to describe the work to eliminate program faults.

 The offending moth was taped into the log book alongside the official report, which stated: "First actual case of a bug being found."


A Colossus to solve an Enigma


During World War II, secret military communication reached a fever pitch. Instructions to armies, air forces and fleets were sent by radio, so the air over Europe was full of easily intercepted messages. The question was whether they could be deciphered, and millions of lives depended on the answer.
The Enigma machine was an ingenious device the Germans developed to let them encipher and decipher messages quickly and accurately with a high degree of complexity. "Unbreakable" is how they described the Enigma code. Fortunately for the Allies, they were wrong.
The codebreakers at Bletchley Park, one of the British government's top-secret establishments, took up the challenge. Alan Turing worked there as a cryptographer; his electromechanical Bombe machines were used to break Enigma, while the electronic Colossus was built to attack the even more complex Lorenz cipher used by the German high command.

In early 1943, a team at Bletchley Park began to construct an electronic machine to attack these machine ciphers. The machine, which they dubbed Colossus, comprised 1,800 vacuum tubes and was completed and working by December of the same year.
By any standards Colossus was one of the world's earliest working programmable electronic digital computers. It was built as a special-purpose machine suited to a narrow range of tasks, yet it proved flexible enough to be programmed to execute a variety of different routines.


Generations of Computers

In 1946, tiny electronic pathways called circuits began to perform the counting that had formerly been done by gears and other mechanical parts. Originally, the term generation referred to a stage of improvement in the development of a product; it is also used for the successive advances in computer technology. With each new generation, the circuitry has become smaller and more advanced than in the generation before it. As a result of this miniaturization, the speed, power, and memory of computers have increased proportionally. New discoveries are constantly being made that affect the way we live, work and play.

 


The First Generation: 1945-1956 (The Vacuum Tube Years)


The first generation computers were huge, slow, expensive, and often undependable. In 1946 two Americans, Presper Eckert and John Mauchly, built the ENIAC (Electronic Numerical Integrator and Computer), an electronic computer which used vacuum tubes instead of the mechanical switches of the Mark I. The ENIAC used thousands of vacuum tubes, which took up a lot of space and gave off a great deal of heat, just as light bulbs do. The ENIAC led to other vacuum-tube computers such as the EDVAC (Electronic Discrete Variable Automatic Computer) and the UNIVAC I (UNIVersal Automatic Computer).

The vacuum tube was an extremely important step in the advancement of computers. Vacuum tubes grew out of Thomas Edison's work on the light bulb and work in a very similar way. Their purpose was to act as an amplifier and a switch. Without any moving parts, vacuum tubes could take very weak signals and make them stronger (amplify them), and they could also stop and start the flow of electricity instantly (switch). These two properties made the ENIAC possible. But the ENIAC gave off so much heat that it had to be cooled by gigantic air conditioners, and even with these huge coolers the vacuum tubes still overheated regularly. It was time for something new.


Mauchly and Eckert: The ENIAC



The first general-purpose electronic computer appeared in 1946. It was developed by John William Mauchly and John Presper Eckert. They called their machine the Electronic Numerical Integrator and Computer (ENIAC).

ENIAC--1946


Unlike previous counting tools, ENIAC had no mechanical parts, no counters, and no gears. It relied solely on vacuum tubes. Each vacuum tube contained an electronic circuit, a tiny pathway that carried electricity. Each circuit could turn on and off very much the way a light bulb does.

ENIAC operated 1000 times faster than the Mark I. It could do 5,000 additions and 300 multiplications per second. The cost of this machine was around 3 million dollars.
However, ENIAC had a number of problems. Its 19,000 vacuum tubes took up so much space that it required a room measuring 20 feet by 40 feet! The tubes also produced a lot of heat and were always burning out. On average, 50 tubes burned out each day. Today, for a few cents, you can buy one chip that has more computing power than ENIAC!









John von Neumann & the EDSAC--1949


In 1946 a mathematician named John von Neumann proposed two changes in computer design: (1) The machine's instructions, he said, should be stored inside the computer. (2) Because electronic circuits are either on or off, he suggested that people use a series of 0's or 1's to code all the information they put into the computer. A zero would stand for off; a one would stand for on. This code is called the binary code and is still used today.
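To make the binary idea concrete, here is a minimal Python sketch (purely illustrative) that converts an ordinary number into the string of 0's and 1's a machine would store, each digit standing for a switch that is off or on.

def to_binary(n):
    """Return the binary digits of a non-negative integer as a string."""
    if n == 0:
        return "0"
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits   # the remainder gives the next lower bit
        n //= 2
    return bits

print(to_binary(13))   # prints 1101  (8 + 4 + 0 + 1)
print(int("1101", 2))  # prints 13, reading the switch pattern back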
The EDSAC (Electronic Delay Storage Automatic Computer) had 3,000 vacuum tubes and the programs were input using paper tapes.






Eckert and Mauchly & the UNIVAC--1951



In 1951, Eckert and Mauchly designed another computer called the UNIVAC (UNIVersal Automatic Computer). It was the first computer to be sold to businesses. UNIVAC contained 5,400 vacuum tubes and used magnetic tape to feed instructions to the computer. The UNIVAC was used to predict the outcome of the presidential election won by Dwight Eisenhower. No one believed the machine's prediction at first, but it turned out to be very accurate.








The Second Generation: 1956-1963 (The Era of the Transistor)


The transistor computer did not last as long as the vacuum tube computer, but it was no less important in the advancement of computer technology. In 1947 three scientists, John Bardeen, William Shockley, and Walter Brattain, working at AT&T's Bell Labs, invented what would replace the vacuum tube forever. This invention was the transistor, which functions like a vacuum tube in that it can be used to relay and switch electronic signals.

There were obvious differences between the transistor and the vacuum tube. The transistor was faster, more reliable, smaller, and much cheaper to build than a vacuum tube. One transistor replaced the equivalent of 40 vacuum tubes. Transistors were made of solid material, chiefly silicon, an abundant element (second only to oxygen) found in beach sand and glass, so they were very cheap to produce. Transistors were found to conduct electricity faster and better than vacuum tubes. They were also much smaller and gave off virtually no heat compared with vacuum tubes. Their use marked a new beginning for the computer. Without this invention, space travel in the 1960's would not have been possible. However, a new invention would advance our ability to use computers even further.


The Third Generation: 1965-1970 (Integrated Circuits-Miniaturizing the Computer)


Transistors were a tremendous breakthrough in advancing the computer. However, no one could have predicted that thousands, and now even millions, of transistors (circuits) could be compacted into such a small space. The integrated circuit, or semiconductor chip as it is sometimes called, packs a huge number of transistors onto a single wafer of silicon. Robert Noyce of Fairchild Semiconductor and Jack Kilby of Texas Instruments independently discovered the amazing attributes of integrated circuits. Placing such large numbers of transistors on a single chip vastly increased the power of a single computer and lowered its cost considerably.

Since the invention of the integrated circuit, the number of transistors that can be placed on a single chip has doubled roughly every two years, shrinking both the size and cost of computers even further and further enhancing their power. Most electronic devices today use some form of integrated circuits mounted on printed circuit boards--thin pieces of Bakelite or fiberglass with electrical connections etched onto them--sometimes called a motherboard.
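As a back-of-the-envelope illustration of that doubling (the starting count and dates below are rough, illustrative figures, not exact data for any real chip), this short Python sketch compounds a few thousand transistors over ten two-year steps.

# Rough illustration of "doubling every two years"; numbers are illustrative.
transistors = 2_300      # roughly the scale of an early-1970s microprocessor
year = 1971

for _ in range(10):      # ten doublings = twenty years
    year += 2
    transistors *= 2
    print(year, f"{transistors:,}")

# After twenty years of doubling, a few thousand transistors grow past two
# million, which is why chips kept shrinking in size and cost while gaining power.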
These third generation computers could carry out instructions in billionths of a second. The size of these machines dropped to the size of small file cabinets. Yet, the single biggest advancement in the computer era was yet to be discovered.




The Fourth Generation: 1971-Today (The Microprocessor)



This generation can be characterized by both the jump to monolithic integrated circuits (millions of transistors put onto one integrated circuit chip) and the invention of the microprocessor (a single chip that could do all the processing of a full-scale computer). By putting millions of transistors onto a single chip, computers could perform more calculations at faster speeds. Because electricity travels about a foot in a billionth of a second, the smaller the distances inside the computer, the greater its speed.
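To see why distance matters, here is a quick worked calculation in Python using the text's rough figure of one foot per billionth of a second; the one-centimetre chip size is only an illustrative assumption.

# Rough arithmetic behind "smaller distance = greater speed".
FEET_PER_NANOSECOND = 1.0        # approximate signal speed from the text

def crossing_time_ns(distance_feet):
    """Nanoseconds for a signal to travel the given distance."""
    return distance_feet / FEET_PER_NANOSECOND

print(crossing_time_ns(1.0))     # one foot of wiring: about 1 ns
print(crossing_time_ns(0.03))    # a chip about 1 cm across: about 0.03 ns

# A signal confined to a centimetre-sized chip arrives dozens of times sooner
# than one that must travel a foot, so shrinking the circuitry lets the
# computer run faster.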
However, what really triggered the tremendous growth of computers and their significant impact on our lives was the invention of the microprocessor. Ted Hoff, employed by Intel (Robert Noyce's new company), invented a chip the size of a pencil eraser that could do all the computing and logic work of a computer. The microprocessor was made to be used in calculators, not computers. It led, however, to the invention of personal computers, or microcomputers.

It wasn't until the 1970's that people began buying computers for personal use. One of the earliest personal computers was the Altair 8800 computer kit. In 1975, you could purchase this kit and put it together to make your own personal computer. In 1977, the Apple II was sold to the public and in 1981 IBM entered the PC (personal computer) market.

Today we have all heard of Intel and its Pentium processors and now we know how it all got started. The computers of the next generation will have millions upon millions of transistors on one chip and will perform over a billion calculations in a single second. There is no end in sight for the computer movement.






The Apple Story


Steven Paul Jobs and Stephen Wozniak were teenagers when the microcomputer was invented. They grew up in Silicon Valley, an area near Palo Alto, California, known for computer and electronic industries. Later they got jobs there as engineers. Being interested in computers, they joined the Homebrew Computer Club and began tinkering with computers in earnest.
In 1976, Wozniak, who had been interested in computers since fourth grade, decided to build a small computer that would be easy to use. His friends were impressed with it, and Jobs wanted to market it. The two started their business, Apple Computer, Inc., with the $1,300 they got by selling Jobs's Volkswagen bus and Wozniak's scientific calculator.
The Apple II, named for the summers Jobs spent picking apples in the Northwest, was a huge success. Since then, Apple has made many computers and devices, including the Apple II Plus, Apple IIe, Apple IIc, Apple IIGS, Macintosh, iMac, iPod, and iPhone.

 


The Story Behind Bill Gates and Microsoft


The Birth of Microsoft

In December of 1974, Paul Allen was on his way to visit Gates when he stopped along the way to browse the current magazines. What he saw changed his and Bill Gates's lives forever. On the cover of Popular Electronics was a picture of the Altair 8800 and the headline "World's First Microcomputer Kit to Rival Commercial Models." He bought the issue and rushed over to Gates's dorm room. They both recognized this as their big opportunity. The two knew that the home computer market was about to explode and that someone would need to make software for the new machines.

Within a few days, Gates had called MITS (Micro Instrumentation and Telemetry Systems), the makers of the Altair. He told the company that he and Allen had developed a BASIC that could be used on the Altair. This was a lie: they had not written a single line of code. They had neither an Altair nor the chip that ran the computer. The MITS company did not know this and was very interested in seeing their BASIC. So Gates and Allen began working feverishly on the BASIC they had promised. The code for the program was left mostly up to Bill Gates, while Paul Allen worked on a way to simulate the Altair on the school's PDP-10.

Eight weeks later, the two felt their program was ready. Allen flew to MITS to show off their creation. The day after he arrived, it was time to test their BASIC. Entering the program into the company's Altair was the first time Allen had ever touched one. If the Altair simulation he had designed or any of Gates's code was faulty, the demonstration would most likely have ended in failure. That was not the case: the program worked perfectly the first time. MITS arranged a deal with Gates and Allen to buy the rights to their BASIC. Gates was convinced that the software market had been born. Within a year, Bill Gates had dropped out of Harvard and Microsoft was formed.






The Fifth Generation: Present & Beyond (Artificial Intelligence)

Defining the fifth generation of computers is somewhat difficult because the field is still in its infancy. The most famous example of a fifth generation computer is the fictional HAL 9000 from Arthur C. Clarke's novel 2001: A Space Odyssey. HAL performed all of the functions currently envisioned for real-life fifth generation computers. With artificial intelligence, HAL could reason well enough to hold conversations with its human operators, use visual input, and learn from its own experiences. (Unfortunately, HAL was a little too human and had a psychotic breakdown, commandeering a spaceship and killing most of the humans on board.)

2001: A Space Odyssey
2010: The Year We Make Contact
Artificial Intelligence--A.I.
I, Robot
The movie industry has produced many movies about artificial intelligence and how this generation of computers might happen in the future.
Though the wayward HAL 9000 may be far beyond the reach of real-life computer designers, many of its functions are not. Using recent engineering advances, computers are able to accept spoken-word instructions (voice recognition) and imitate human reasoning. The ability to translate a foreign language is also moderately possible with fifth generation computers. This feat seemed a simple objective at first, but appeared much more difficult when programmers realized that human understanding relies as much on context and meaning as it does on the simple translation of words.
Computers today have some attributes of fifth generation computers. For example, expert systems assist doctors in making diagnoses by applying the problem-solving steps a doctor might use in assessing a patient's needs. It will take several more years of development before expert systems are in widespread use.
