
HISTORY OF COMPUTERS

Computing hardware has been an important component of the process of calculation and data storage since it became useful for numerical values to be processed and shared. The earliest computing hardware was probably some form of tally stick; later record-keeping aids include Phoenician clay shapes which represented counts of items, probably livestock or grains, in containers. Something similar is found in early Minoan excavations. These seem to have been used by the merchants, accountants, and government officials of the time.

Devices to aid computation have changed from simple recording and counting devices to the abacus, the slide rule, analog computers, and more recent electronic computers. Even today, an experienced abacus user using a device hundreds of years old can sometimes complete basic calculations more quickly than an unskilled person using an electronic calculator, though for more complex calculations, computers outperform even the most skilled human.

This article covers major developments in the history of computing hardware, and attempts to put them in context. For a detailed timeline of events, see the computing timeline article. The history of computing article is a related overview and treats methods intended for pen and paper, with or without the aid of tables.
The Five Generations of Computers
The history of computer development is often discussed in terms of the different generations of computing devices. Each generation of computer is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices. Read about each generation and the developments that led to the devices we use today.
First Generation - 1940-1956: Vacuum Tubes

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and, in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a client, the U.S. Census Bureau, in 1951.
Second Generation - 1956-1963: Transistors

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from magnetic drum to magnetic core technology.
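
To make this shift in abstraction concrete, here is a minimal illustrative sketch. The bit pattern and the mnemonic are invented for illustration (they do not belong to any real instruction set), and Python stands in for an early high-level language such as FORTRAN:

    # The same addition at three levels of abstraction (illustrative only):
    #
    #   Machine language -- the raw bits a first-generation programmer wrote:
    #       10110000 00000101
    #   Assembly language -- symbolic words an assembler turns into bits:
    #       ADD TOTAL, 5
    #
    # High-level language -- the programmer states the intent; a compiler
    # (or here, the Python interpreter) produces the machine instructions:
    total = 0
    total = total + 5
    print(total)  # prints 5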
Third Generation - 1964-1971: Integrated Circuits

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
Fourth Generation - 1971-Present: Microprocessors

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer - from the central processing unit and memory to input/output controls - on a single chip.

In 1981 IBM introduced its first computer for the home user, and
in 1984 Apple introduced the Macintosh. Microprocessors also
moved out of the realm of desktop computers and into many areas
of life as more and more everyday products began to use
microprocessors.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse, and handheld devices.
Fifth Generation - Present and Beyond: Artificial Intelligence

Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see history of computing hardware) which was supposed to perform much of its computation using massive parallelism. It was to be the end result of a massive government/industry research project in Japan during the 1980s. It aimed to create an "epoch-making computer" with supercomputer-like performance and usable artificial intelligence capabilities.

The term fifth generation was intended to convey the system as being a leap beyond existing machines. Computers using vacuum tubes were called the first generation; transistors and diodes, the second; ICs, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance and self-organization.

FIRST GENERATION
World War II gave rise to numerous developments and started off the computer age. The Electronic Numerical Integrator and Computer (ENIAC) was produced by a partnership between the University of Pennsylvania and the US government. It consisted of 18,000 vacuum tubes and 70,000 resistors. It was developed by John Presper Eckert and John W. Mauchly and was a general-purpose computer. "Von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) in 1945 with a memory to hold both a stored program as well as data." Von Neumann's design allowed all the computer's functions to be controlled by a single source.
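
To illustrate the stored-program idea in the quoted sentence, here is a minimal sketch in Python: a toy machine whose instructions live in the same memory as its data, so a single control loop fetches and executes its own program. The instruction names and format are invented for illustration and are not EDVAC's actual instruction set:

    # Toy stored-program machine: the program itself sits in memory.
    memory = [
        ("LOAD", 5),      # put 5 into the accumulator
        ("ADD", 7),       # add 7 to the accumulator
        ("PRINT", None),  # output the accumulator
        ("HALT", None),   # stop the machine
    ]
    acc = 0   # accumulator (the single working register)
    pc = 0    # program counter: which memory cell to fetch next
    while True:
        op, arg = memory[pc]  # fetch the next instruction from memory itself
        pc += 1
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "PRINT":
            print(acc)        # prints 12
        elif op == "HALT":
            break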

Then in 1951 came the Universal Automatic Computer (UNIVAC I), designed by Remington Rand and collectively owned by the US Census Bureau and General Electric. UNIVAC famously predicted the winner of the 1952 presidential election, Dwight D. Eisenhower.

In first-generation computers, the operating instructions or programs were built specifically for the task for which the computer was manufactured. Machine language was the only way to tell these machines to perform operations. These computers were difficult to program, and even more difficult to fix when they malfunctioned. First-generation computers used vacuum tubes and magnetic drums (for data storage).

SECOND GENERATION

The invention of the transistor marked the start of the second generation. Transistors took the place of the vacuum tubes used in first-generation computers. The first large-scale machines using this technology were made to meet the requirements of atomic energy laboratories. Another benefit, this one for programmers, was that the second generation replaced machine language with assembly language. Even though complex in itself, assembly language was much easier than binary code.

Second-generation computers also started showing the characteristics of modern-day computers, with utilities such as printers, disk storage, and operating systems. Much financial information was processed using these computers.

In second-generation computers, the instructions (programs) could be stored inside the computer's memory. High-level languages such as COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translator) were used, and they are still used for some applications nowadays.

FOURTH GENERATION

Fourth-generation computers are the modern-day computers. Their size started to go down with improvements in integrated circuits. Very-large-scale integration (VLSI) and ultra-large-scale integration (ULSI) ensured that millions of components could fit onto a small chip. This reduced the size and price of computers while at the same time increasing their power, efficiency, and reliability. "The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by locating all the components of a computer (central processing unit, memory, and input and output controls) on a minuscule chip."

The reduction in cost and the availability of computing power in a small package allowed everyday users to benefit. First came the minicomputers, which offered users different applications, the most famous of these being word processors and spreadsheets that non-technical users could operate. Video game systems like the Atari 2600 generated the interest of the general populace in computers.

In 1981, IBM introduced personal computers for home and office use. "The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were being used." Computer size kept shrinking over the years, going down from desktops to laptops to palmtops. The Macintosh introduced the graphical user interface, in which users didn't have to type instructions but could use a mouse instead.

The continued improvement allowed the networking of computers for the sharing of data. Local Area Networks (LAN) and Wide Area Networks (WAN) were potential benefits, in that they could be implemented in corporations so that everybody could share data over them. Soon the Internet and World Wide Web appeared on the computer scene and fomented the high-tech revolution of the '90s.

FIFTH GENERATION

Fifth-generation computers exist only in the minds of advanced research scientists and are being tested out in laboratories. These computers will be based on Artificial Intelligence (AI); they will be able to take commands in an audio-visual way and carry out instructions. Many of the operations which require low-level human intelligence will be performed by these computers. Parallel processing is coming, showing the possibility that the power of many CPUs can be used side by side (see the sketch below), and such computers will be more powerful than those built around a single central processor. Advances in superconductor technology will greatly improve the speed of information traffic. The future looks bright for computers.
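
As a rough modern illustration of the parallel-processing idea above, here is a minimal sketch using Python's standard multiprocessing module. The task (summing squares over a range of numbers) and the chunk boundaries are invented for the example; the point is only that independent pieces of one problem can run on several CPUs side by side:

    # Minimal sketch: spreading one computation across several CPUs.
    from multiprocessing import Pool

    def sum_of_squares(bounds):
        lo, hi = bounds
        return sum(n * n for n in range(lo, hi))

    if __name__ == "__main__":
        # Split one big range into four chunks; a pool of worker processes
        # (by default, one per available CPU) handles them side by side.
        chunks = [(0, 250_000), (250_000, 500_000),
                  (500_000, 750_000), (750_000, 1_000_000)]
        with Pool() as pool:
            partial_sums = pool.map(sum_of_squares, chunks)
        # Same answer a single CPU would compute, arrived at in parallel.
        print(sum(partial_sums))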
