
First Generation: Vacuum Tubes (1940-1956)

The first computer systems used vacuum tubes for circuitry and magnetic drums for memory,
and were often enormous, taking up entire rooms. These computers were very expensive to operate
and in addition to using a great deal of electricity, the first computers generated a lot of heat, which
was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming
language understood by computers, to perform operations, and they could only solve one problem at
a time. It would take operators days or even weeks to set up a new problem. Input was based on
punched cards and paper tape, and output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The
UNIVAC was the first commercially produced computer; it was delivered to its first customer, the
U.S. Census Bureau, in 1951.

A UNIVAC computer at the Census Bureau.


Image Source: United States Census Bureau
Recommended Reading: Webopedia's ENIAC definition

Second Generation: Transistors (1956-1963)

The world would see transistors replace vacuum tubes in the second generation of computers.
The transistor was invented at Bell Labs in 1947 but did not see widespread use in computers until the
late 1950s.
The transistor was far superior to the vacuum tube, allowing computers to become smaller,
faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors.
Though the transistor still generated a great deal of heat that subjected the computer to damage, it
was a vast improvement over the vacuum tube. Second-generation computers still relied on punched
cards for input and printouts for output.

From Binary to Assembly

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly,
languages, which allowed programmers to specify instructions in words. High-level programming
languages were also being developed at this time, such as early versions of COBOL and FORTRAN.
These were also the first computers that stored their instructions in their memory, which moved from
a magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.

Third Generation: Integrated Circuits (1964-1971)

The development of the integrated circuit was the hallmark of the third generation of
computers. Transistors were miniaturized and placed on silicon chips, made of semiconductor
material, which drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third generation computers
through keyboards and monitors and interfaced with an operating system, which allowed the device
to run many different applications at one time with a central program that monitored the memory.
Computers for the first time became accessible to a mass audience because they were smaller and
cheaper than their predecessors.

Fourth Generation: Microprocessors (1971-Present)

The microprocessor brought the fourth generation of computers, as thousands of integrated
circuits were built onto a single silicon chip. What in the first generation filled an entire room could now
fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the
computer—from the central processing unit and memory to input/output controls—on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the
Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas
of life as more and more everyday products began to use microprocessors.
As these small computers became more powerful, they could be linked together to form networks,
which eventually led to the development of the Internet. Fourth generation computers also saw the
development of GUIs, the mouse, and handheld devices.

Intel's first microprocessor, the 4004, was conceived by Ted Hoff and Stanley
Mazor.
Image Source: Intel Timeline (PDF)

Fifth Generation: Artificial Intelligence (Present and Beyond)

Fifth generation computing devices, based on artificial intelligence, are still in development,
though there are some applications, such as voice recognition, that are being used today. The use
of parallel processing and superconductors is helping to make artificial intelligence a reality.
Quantum computation and molecular and nanotechnology will radically change the face of
computers in years to come. The goal of fifth-generation computing is to develop devices that
respond to natural language input and are capable of learning and self-organization.

Early Calculating And Computing Machines: From The Abacus To Babbage

The Abacus

There is a long history detailing the invention of computing and calculating machines. The
earliest recorded calculating device is the abacus. Used as a simple computing device for performing
arithmetic, the abacus most likely appeared first in Babylonia (now Iraq) over 5000 years ago. Its more
familiar form today is derived from the Chinese version pictured below.

The abacus is more of a counting device than a true calculator. Nonetheless, it was used for centuries
as a reliable means for doing additions and subtractions.

Napier's Bones
Next, we move ahead several centuries, to Scotland. John Napier was born in 1550 near
Edinburgh. Though most of the details of his education are unknown, he apparently attended St.
Andrews and Cambridge. Napier's fame as a mathematician was secured with his discovery of
logarithms. Tables of logarithms made it easier for astronomers, bankers, and others to reduce the
more complex operations of multiplication and division to simpler additions and subtractions. We will
return to consider the use of logarithms shortly.

During his lifetime, though, Napier was more widely recognized as the inventor of a calculating tool
known as "Napier's Bones." These were a series of rods (often carved from bones) that had squares
inscribed in them. Using the rods, one could perform multiplication by looking up partial products and
summing them. Division could be performed similarly as a series of lookups and subtractions.
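
To make the partial-product idea concrete, here is a minimal Python sketch (an illustration added to this
text, not a description of the rods themselves; the function name and sample numbers are made up for
the example):

    def multiply_with_partial_products(a, b):
        # Multiply the way a user of Napier's rods would: take the partial
        # product of a with each digit of b, shift it to that digit's place
        # value, and sum the results.
        total = 0
        for position, digit in enumerate(reversed(str(b))):
            partial = a * int(digit)             # what a rod lookup supplies
            total += partial * (10 ** position)  # shift by place value
        return total

    print(multiply_with_partial_products(425, 63))   # 26775, the same as 425 * 63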

Later the rods were mechanized by replacing them with cylinders that could be rotated into position.
For a demonstration of how Napier's Bones work consult

The Slide Rule


As mentioned earlier, John Napier had introduced the use of logarithms. Subsequently, he
collaborated with fellow mathematician Henry Briggs (1561-1630), converting his original logarithmic
calculations to the more familiar base-10 representation used today.

The utility of logs can be seen in the following important results.

a * b = 10 ^ ( log (a) + log (b) ), and
a / b = 10 ^ ( log (a) - log (b) )

[Note: '*' means product or "times"; '^' denotes "raised to the power of."]

However, one could not exploit these results without performing some time-consuming tasks. In order to
multiply two numbers a and b:

1. You must look up two logs.
2. You must add them.
3. You must look up the corresponding number whose log is their sum.
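
As a quick illustration of these three steps, here is a small Python sketch (added for illustration, not part
of the original text; the sample values 37.5 and 4.2 are arbitrary):

    import math

    a, b = 37.5, 4.2
    log_sum = math.log10(a) + math.log10(b)  # steps 1 and 2: look up both logs and add them
    product = 10 ** log_sum                  # step 3: find the number whose log is that sum
    print(product, a * b)                    # both are 157.5 (up to rounding error)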

Edmund Gunter (1581-1626) fashioned a device to help remedy this situation. Called "Gunter's Scale," it
plotted a logarithmic scale on a two-foot ruler. By adding and subtracting lengths, it was possible to
obtain the results of multiplication and division.

William Oughtred (1574-1660) improved upon Gunter's single ruler in 1630 by combining two circular
scales that could be moved relative to one another. The moving scales eliminated the need for a pair
of dividers, and the design thereby became the early ancestor of the modern slide rule. Whether straight or circular,
the slide rule represents an analog calculator because the results of the operations are based on the
continuous scale of distances.

Pascaline
For years historians considered the inventor of the first mechanical calculator to be the French
mathematician, philosopher, and apologist Blaise Pascal (1623-1662). To assist his father's work as a tax
collector, the 19-year-old Pascal created a mechanical device that performed simple addition and
subtraction. Even though the machine brought Pascal some renown, it was a commercial failure
because it was inefficient and difficult to use.

Subtraction, for example, was performed by adding the nine's complement. Interestingly,
while arithmetic with complements is cumbersome for humans, it is much easier to implement in
machines. A form of complement arithmetic is used by most modern processors today to perform
subtraction with integers.
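
Here is a minimal Python sketch of nine's-complement subtraction (added for illustration, not from the
original text; the function name and the four-digit width are assumptions):

    def nines_complement_subtract(a, b, digits=4):
        # Subtract b from a (with 0 <= b <= a) using only addition, the trick
        # that lets an adding mechanism such as the Pascaline handle subtraction.
        limit = 10 ** digits
        b_complement = (limit - 1) - b   # complement every digit of b to 9
        total = a + b_complement         # ordinary addition
        # Remove the carry out of the top digit and add it back in
        # (the "end-around carry").
        return (total % limit) + (total // limit)

    print(nines_complement_subtract(215, 87))   # 128, the same as 215 - 87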

Stepped Reckoner
The German scientist, mathematician, and philosopher Gottfried Wilhelm von Leibniz (1646-
1716) is the next notable figure in our story. He is the designer of the Stepped Reckoner, the first fully
featured arithmetic calculator capable of performing multiplication and division as well as addition
and subtraction. Its chief feature was the so-called "Leibniz wheel," a gear-shaped metal cylinder that
served as a mechanical multiplier. A crank rotated the collection of cylinders, which turned the wheels
that displayed the digits of the answer. However, Leibniz's prototype device never quite worked
properly. Consequently, the machine is more important for its historical influence than as a practical
device.

Although the Stepped Reckoner was a decimal device, it is interesting to note that by
coincidence, Leibniz was the first mathematician to investigate the properties of base-2 or binary
numbering. As you read earlier, binary coding is the native tongue of computers today.

Charles Babbage

The first truly modern pioneer in the history of computing is the Englishman Charles Babbage
(1791-1871). Babbage designed not one but two very different automated calculating machines.
Unfortunately, his ideas were so modern that they far outstripped the capabilities of nineteenth-century
technology. As a result, he was never able to realize them fully; consequently, many of these ideas
were lost to posterity. Much of the mathematical computation done in Babbage's day depended on
consulting mathematical tables. Thus, enterprises such as navigation, science, and commerce relied
heavily on the accuracy of these tables to aid the human "computers" who used them. The process of
creating and publishing these tables was not always reliable, though. As might be expected, errors
could arise either in the original computation or in transcription. The individuals doing the original computations
might commit errors, and the publishing process might introduce others. Babbage reasoned that if a
machine could be constructed to automate both of these processes, it would also bypass these
sources of errors. He designed such a device, which he dubbed the Difference Engine. The Difference
Engine not only calculated tables of figures but also prepared plates for printing them. Unfortunately,
the mechanical technology of the day was not suitable for the kinds of specifications needed to
realize a fully functional device. That Babbage was a perfectionist who continually changed its design
did not help matters either.

All in all, for its day, the Difference Engine was an amazing conception. Yet as an automated
calculating device, it was a very special-purpose machine. The machine could perform a
complicated series of computations for a given set of values. The results would be determined, of
course, by both the original values and the process used to make the calculations. But, unfortunately,
the machine could perform just that process and nothing else. To be able to perform another type of
computation would require redesigning and building an entirely different machine. This very problem
occupied Babbage during one of his breaks from work on the Difference Engine in 1833.

The idea that unlocked the solution to the problem was, surprisingly, the contemporary
invention of the Jacquard loom. J. M. Jacquard developed a device that attached to a loom to help
automate the process of weaving. Jacquard's invention automated the action of the weaving
needles to achieve a desired design by using a series of punched cards that directed, or programmed,
this process. That is, the punched cards contained the instructions or program, written in a manner the
weaving loom could understand, to direct the weaving.

Babbage recognized that computation in general could be organized in just this manner: The
computational process could be programmed. The key is that the program does not have to be
hardwired into the machine but fed into it much like the values processed. The advantages are
enormous. Such a machine could not only vary what it processes but also vary how it processes. A
programmable machine would be general purpose rather than limited in its capabilities. Changing the
controlling program would enable the machine to perform an entirely different computational process.

Babbage called his design for such a machine the Analytical Engine. The computations were
directed by punched cards. Some of these cards specified the actual steps of the process to be
performed; others specified the particular values or data to be used by the process. Babbage later
recognized that other means could be used instead of punched cards to program a machine, but
Jacquard's punched cards sparked his original insight. Babbage was assisted in this work on the
Analytical Engine by Lady Ada Lovelace, who is sometimes referred to as the first computer
programmer and in whose honor the programming language Ada was named.

Had he been able to realize the Analytical Engine as a functioning device, Babbage would
have created the first general-purpose programmable computing machine. But, like the Difference
Engine, it had little life beyond the drawing board.
EARLY CALCULATING DEVICE
FIRST TO FIFTH GENERATIONS OF COMPUTER
SUBMITTED BY:

ALTHEA RAE M. AQUINO

GRADE 7 - St. Anthony of Padua

SUBMITTED TO:

MRS. MARY JANE A. MARIGMEN

COMPUTER TEACHER