
Kervin Comendador

BSEcE IV-G

Computer History and Development


Nothing epitomizes modern life better than the computer. For better or worse,
computers have infiltrated every aspect of our society. Today computers do much
more than simply compute: supermarket scanners calculate our grocery bill while
keeping store inventory; computerized telephone switching centers play traffic cop
to millions of calls and keep lines of communication untangled; and automatic teller
machines (ATM) let us conduct banking transactions from virtually anywhere in the
world. But where did all this technology come from and where is it heading?

Early Computing Machines and Inventors

The abacus, which emerged about 5,000 years ago in Asia Minor and is still in use
today, may be considered the first computer. This device allows users to make
computations using a system of sliding beads arranged on a rack. Early merchants
used the abacus to keep track of trading transactions. But as the use of paper and pencil
spread, particularly in Europe, the abacus lost its importance. It took nearly 12
centuries, however, for the next significant advance in computing devices to
emerge. In 1642, Blaise Pascal (1623-1662), the 18-year-old son of a French tax
collector, invented what he called a numerical wheel calculator to help his father
with his duties. This brass rectangular box, also called a Pascaline, used eight
movable dials to add sums up to eight figures long. Pascal's device used a base of
ten to accomplish this. The drawback to the Pascaline, of course, was its limitation
to addition.

In 1694, a German mathematician and philosopher, Gottfried Wilhelm von Leibniz (1646-1716), improved the Pascaline by creating a machine that could also multiply.
Like its predecessor, Leibniz's mechanical multiplier worked by a system of gears
and dials. Partly by studying Pascal's original notes and drawings, Leibniz was able
to refine his machine. It wasn't until 1820, however, that mechanical calculators
gained widespread use. Charles Xavier Thomas de Colmar, a Frenchman, invented a
machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithmometer, with its enhanced versatility, was widely used up until the First World War. Although later inventors refined Colmar's calculator, he, together with fellow inventors Pascal and Leibniz, helped define the age of mechanical computation.

The real beginnings of computers as we know them today, however, lay with an
English mathematics professor, Charles Babbage (1791-1871). Frustrated at the
many errors he found while examining calculations for the Royal Astronomical
Society, Babbage declared, "I wish to God these calculations had been performed by
steam!" With those words, the automation of computers had begun. By 1812,
Babbage noticed a natural harmony between machines and mathematics: machines
were best at performing tasks repeatedly without mistake; while mathematics,
particularly the production of mathematics tables, often required the simple
repetition of steps. The problem centered on applying the ability of machines to the
needs of mathematics. Babbage's first attempt at solving this problem came in 1822, when he proposed a machine to compute mathematical tables by the method of differences, called a Difference
Engine. Powered by steam and large as a locomotive, the machine would have a
stored program and could perform calculations and print the results automatically.
After working on the Difference Engine for 10 years, Babbage was suddenly inspired
to begin work on the first general-purpose computer, which he called the Analytical
Engine. Babbage's assistant, Augusta Ada King, Countess of Lovelace (1815-1852)
and daughter of English poet Lord Byron, was instrumental in the machine's design.
One of the few people who understood the Engine's design as well as Babbage, she
helped revise plans, secure funding from the British government, and communicate
the specifics of the Analytical Engine to the public. Also, Lady Lovelace's fine
understanding of the machine allowed her to create the instruction routines to be
fed into the computer, making her the first female computer programmer. In the
1980's, the U.S. Defense Department named a programming language ADA in her
honor.

Babbage's steam-powered Engine, although ultimately never constructed, may seem primitive by today's standards. However, it outlined the basic elements of a
modern general purpose computer and was a breakthrough concept. Consisting of
over 50,000 components, the basic design of the Analytical Engine included input
devices in the form of perforated cards containing operating instructions and a
"store" for memory of 1,000 numbers of up to 50 decimal digits long. It also
contained a "mill" with a control unit that allowed processing instructions in any
sequence, and output devices to produce printed results. Babbage borrowed the
idea of punch cards to encode the machine's instructions from the Jacquard loom.
The loom, introduced in 1804 and named after its inventor, Joseph-Marie Jacquard,
used punched boards that controlled the patterns to be woven.

In 1889, an American inventor, Herman Hollerith (1860-1929), also applied the Jacquard loom concept to computing. His first task was to find a faster way to
compute the U.S. census. The previous census in 1880 had taken nearly seven
years to count and with an expanding population, the bureau feared it would take
10 years to count the latest census. Unlike Babbage's idea of using perforated cards
to instruct the machine, Hollerith's method used cards to store data,
which he fed into a machine that compiled the results mechanically. Each punch on
a card represented one number, and combinations of two punches represented one
letter. As many as 80 variables could be stored on a single card. Instead of ten
years, census takers compiled their results in just six weeks with Hollerith's
machine. In addition to their speed, the punch cards served as a storage method for
data and they helped reduce computational errors. Hollerith brought his punch card
reader into the business world, founding Tabulating Machine Company in 1896, later
to become International Business Machines (IBM) in 1924 after a series of mergers.
Other companies, such as Remington Rand and Burroughs, also manufactured punch card readers for business use. Both business and government used punch cards for data
processing until the 1960's.
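
As a rough illustration of the data model described above, the sketch below (in C, with an invented row layout rather than the exact Hollerith code) treats a card as 80 columns of 12 possible punch positions: a single punch stands for a digit and a pair of punches for a letter.

    #include <stdio.h>
    #include <stdint.h>

    #define CARD_COLUMNS 80   /* one card holds up to 80 columns of data */
    #define CARD_ROWS    12   /* classic punch cards had 12 punch rows   */

    /* A card is modeled as 80 columns, each a 12-bit punch mask. */
    typedef struct {
        uint16_t column[CARD_COLUMNS];
    } PunchCard;

    /* Encode a digit 0-9 as a single punch in its row (illustrative layout). */
    static void punch_digit(PunchCard *card, int col, int digit) {
        card->column[col] = (uint16_t)(1u << digit);
    }

    /* Encode a letter as a zone punch plus a digit punch (two holes per column). */
    static void punch_letter(PunchCard *card, int col, int zone_row, int digit_row) {
        card->column[col] = (uint16_t)((1u << zone_row) | (1u << digit_row));
    }

    int main(void) {
        PunchCard card = {0};
        punch_digit(&card, 0, 7);        /* the number 7 in column 0      */
        punch_letter(&card, 1, 11, 1);   /* a letter: zone row 11 + row 1 */
        printf("column 0 mask: %03X\n", card.column[0]);
        printf("column 1 mask: %03X\n", card.column[1]);
        return 0;
    }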

In the ensuing years, several engineers made other significant advances. Vannevar
Bush (1890-1974) developed a calculator for solving differential equations in 1931.
The machine could solve complex differential equations that had long left scientists
and mathematicians baffled. The machine was cumbersome because hundreds of
gears and shafts were required to represent numbers and their various relationships
to each other. To eliminate this bulkiness, John V. Atanasoff (1903-1995), a professor at Iowa State College (now Iowa State University), and his graduate student,
Clifford Berry, envisioned an all-electronic computer that applied Boolean algebra to
computer circuitry. This approach was based on the mid-19th century work of
George Boole (1815-1864), who clarified the binary system of algebra, which held that any mathematical equation could be expressed as either true or false. By
extending this concept to electronic circuits in the form of on or off, Atanasoff and
Berry had developed the first all-electronic computer by 1940. Their project,
however, lost its funding and their work was overshadowed by similar developments
by other scientists.
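
The leap from Boole's true/false algebra to on/off circuitry can be seen in miniature in the C sketch below, which builds a one-bit adder from nothing but AND, OR, XOR and NOT, the same logical operations Atanasoff and Berry set out to implement electronically.

    #include <stdio.h>

    /* Boole's algebra: every value is either true (1) or false (0).        */
    /* In an electronic computer the same two states become "on" and "off". */
    int main(void) {
        int a = 1;  /* switch on  */
        int b = 0;  /* switch off */

        printf("a AND b = %d\n", a & b);   /* both switches must be on     */
        printf("a OR  b = %d\n", a | b);   /* at least one switch is on    */
        printf("NOT a   = %d\n", !a);      /* invert the state of a switch */

        /* A one-bit adder built purely from these operations: */
        int sum   = a ^ b;                 /* XOR gives the sum bit        */
        int carry = a & b;                 /* AND gives the carry bit      */
        printf("a + b = carry %d, sum %d\n", carry, sum);
        return 0;
    }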

Five Generations of Modern Computers

First Generation (1945-1956)

With the onset of the Second World War, governments sought to develop computers
to exploit their potential strategic importance. This increased funding for computer
development projects hastened technical progress. By 1941 German engineer
Konrad Zuse had developed a computer, the Z3, to design airplanes and missiles.
The Allied forces, however, made greater strides in developing powerful computers.
In 1943, the British completed a secret code-breaking computer called Colossus to
decode German messages. The Colossus's impact on the development of the
computer industry was rather limited for two important reasons. First, Colossus was
not a general-purpose computer; it was only designed to decode secret messages.
Second, the existence of the machine was kept secret until decades after the war.

American efforts produced a broader achievement. Howard H. Aiken (1900-1973), a Harvard engineer working with IBM, succeeded in producing a large-scale calculator by 1944. The purpose of the computer was to create ballistic charts for the U.S. Navy. It was about half as long as a football field and contained about 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I for short, was an electromechanical relay computer. It used electromagnetic signals to move mechanical parts. The machine was slow (taking 3-5 seconds per calculation) and inflexible (in that sequences of calculations could not change), but it could perform basic arithmetic as well as more complex equations.

Another computer development spurred by the war was the Electronic Numerical
Integrator and Computer (ENIAC), produced by a partnership between the U.S.
government and the University of Pennsylvania. Consisting of 18,000 vacuum tubes,
70,000 resistors and 5 million soldered joints, the computer was such a massive
piece of machinery that it consumed 160 kilowatts of electrical power, enough
energy to dim the lights in an entire section of Philadelphia. Developed by John
Presper Eckert (1919-1995) and John W. Mauchly (1907-1980), ENIAC, unlike the
Colossus and Mark I, was a general-purpose computer that computed at speeds
1,000 times faster than Mark I.

In the mid-1940's John von Neumann (1903-1957) joined the University of Pennsylvania team, initiating concepts in computer design that remained central to
computer engineering for the next 40 years. Von Neumann designed the Electronic
Discrete Variable Automatic Computer (EDVAC) in 1945 with a memory to hold both
a stored program as well as data. This "stored memory" technique as well as the
"conditional control transfer," that allowed the computer to be stopped at any point
and then resumed, allowed for greater versatility in computer programming. The key element of the von Neumann architecture was the central processing unit,
which allowed all computer functions to be coordinated through a single source. In
1951, the UNIVAC I (Universal Automatic Computer), built by Remington Rand,
became one of the first commercially available computers to take advantage of
these advances. Both the U.S. Census Bureau and General Electric owned UNIVACs.
One of UNIVAC's impressive early achievements was predicting the winner of the
1952 presidential election, Dwight D. Eisenhower.
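
The stored-program idea at the heart of the EDVAC design can be sketched in a few lines of C: instructions and data sit in the same memory, and a single control loop fetches, decodes and executes them in sequence. The opcodes below are invented for illustration, not taken from any real machine.

    #include <stdio.h>

    /* Invented opcodes for a toy stored-program machine (illustrative only). */
    enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2, OP_STORE = 3 };

    int main(void) {
        /* Program and data live in the same memory, as in the EDVAC design. */
        /* Each instruction is a pair: opcode, operand address.              */
        int memory[32] = {
            OP_LOAD, 20,    /* accumulator = memory[20]  */
            OP_ADD, 21,     /* accumulator += memory[21] */
            OP_STORE, 22,   /* memory[22] = accumulator  */
            OP_HALT, 0,
        };
        memory[20] = 40;    /* data */
        memory[21] = 2;

        int pc = 0, acc = 0, running = 1;
        while (running) {                       /* the fetch-decode-execute cycle */
            int op = memory[pc], addr = memory[pc + 1];
            pc += 2;
            switch (op) {
                case OP_LOAD:  acc = memory[addr];  break;
                case OP_ADD:   acc += memory[addr]; break;
                case OP_STORE: memory[addr] = acc;  break;
                case OP_HALT:  running = 0;         break;
            }
        }
        printf("result stored at address 22: %d\n", memory[22]);  /* prints 42 */
        return 0;
    }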

First generation computers were characterized by the fact that operating instructions were made-to-order for the specific task for which the computer was to
be used. Each computer had a different binary-coded program called a machine
language that told it how to operate. This made the computer difficult to program
and limited its versatility and speed. Other distinctive features of first generation
computers were the use of vacuum tubes (responsible for their breathtaking size)
and magnetic drums for data storage.

Second Generation Computers (1956-1963)


By 1948, the invention of the transistor greatly changed the computer's
development. The transistor replaced the large, cumbersome vacuum tube in
televisions, radios and computers. As a result, the size of electronic machinery has
been shrinking ever since. The transistor was at work in the computer by 1956.
Coupled with early advances in magnetic-core memory, transistors led to second
generation computers that were smaller, faster, more reliable and more energy-efficient than their predecessors. The first large-scale machines to take advantage
of this transistor technology were early supercomputers, Stretch by IBM and LARC
by Sperry-Rand. These computers, both developed for atomic energy laboratories,
could handle an enormous amount of data, a capability much in demand by atomic
scientists. The machines were costly, however, and tended to be too powerful for
the business sector's computing needs, thereby limiting their attractiveness. Only
two LARCs were ever installed: one in the Lawrence Radiation Labs in Livermore,
California, for which the computer was named (Livermore Atomic Research
Computer) and the other at the U.S. Navy Research and Development Center in
Washington, D.C. Second generation computers replaced machine language with
assembly language, allowing abbreviated programming codes to replace long,
difficult binary codes.
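
The shift from machine language to assembly language amounts to replacing raw binary opcodes with readable mnemonics. The toy translation table below, written in C with invented encodings, shows the kind of one-to-one mapping an early assembler performed.

    #include <stdio.h>
    #include <string.h>

    /* A toy "assembler" table: mnemonics stand in for raw binary opcodes.   */
    /* The encodings are invented for illustration, not from a real machine. */
    struct op { const char *mnemonic; unsigned char opcode; };

    static const struct op table[] = {
        { "LOAD",  0x01 },   /* 0000 0001 */
        { "ADD",   0x02 },   /* 0000 0010 */
        { "STORE", 0x03 },   /* 0000 0011 */
        { "HALT",  0x00 },   /* 0000 0000 */
    };

    /* Translate one mnemonic into its machine-code byte. */
    static int assemble(const char *mnemonic) {
        for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
            if (strcmp(table[i].mnemonic, mnemonic) == 0)
                return table[i].opcode;
        return -1;  /* unknown mnemonic */
    }

    int main(void) {
        const char *program[] = { "LOAD", "ADD", "STORE", "HALT" };
        for (size_t i = 0; i < 4; i++)
            printf("%-5s -> 0x%02X\n", program[i], assemble(program[i]));
        return 0;
    }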

Throughout the early 1960's, there were a number of commercially successful second generation computers used in business, universities, and government from companies such as Burroughs, Control Data, Honeywell, IBM, Sperry-Rand, and others. These second generation computers were also of solid state design, and
contained transistors in place of vacuum tubes. They also contained all the
components we associate with the modern day computer: printers, tape storage,
disk storage, memory, operating systems, and stored programs. One important
example was the IBM 1401, which was universally accepted throughout industry,
and is considered by many to be the Model T of the computer industry. By 1965,
most large businesses routinely processed financial information using second
generation computers.

It was the stored program and programming language that gave computers the
flexibility to finally be cost effective and productive for business use. The stored
program concept meant that instructions to run a computer for a specific function
(known as a program) were held inside the computer's memory, and could quickly
be replaced by a different set of instructions for a different function. A computer
could print customer invoices and minutes later design products or calculate
paychecks. More sophisticated high-level languages such as COBOL (Common
Business-Oriented Language) and FORTRAN (Formula Translator) came into common
use during this time, and are still in use today. These languages
replaced cryptic binary machine code with words, sentences, and mathematical
formulas, making it much easier to program a computer. New types of careers
(programmer, analyst, and computer systems expert) and the entire software
industry began with second generation computers.
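
To see why words and formulas were such an improvement, consider the small C function below, a hypothetical paycheck calculation (the overtime rule is made up for illustration); a compiler turns this one readable formula into the long sequence of machine instructions a programmer would otherwise have written by hand.

    #include <stdio.h>

    /* A paycheck formula written in a high-level language: words and a     */
    /* mathematical expression instead of the long binary codes it replaces. */
    /* The overtime rule here is a made-up example, not from the text.       */
    static double gross_pay(double hours, double rate) {
        if (hours <= 40.0)
            return hours * rate;
        return 40.0 * rate + (hours - 40.0) * rate * 1.5;  /* time and a half */
    }

    int main(void) {
        printf("gross pay: %.2f\n", gross_pay(45.0, 12.00));
        return 0;
    }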

Third Generation Computers (1964-1971)


Though transistors were clearly an improvement over the vacuum tube, they still
generated a great deal of heat, which damaged the computer's sensitive internal
parts. Quartz rock eliminated this problem. Jack Kilby, an engineer with Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components onto a small silicon disc made from quartz. Scientists later managed to fit even more components onto a single semiconductor chip. As a result, computers became ever smaller as more components
were squeezed onto the chip. Another third-generation development included the
use of an operating system that allowed machines to run many different programs
at once with a central program that monitored and coordinated the computer's
memory.

Fourth Generation (1971-Present)

After the integrated circuits, the only place to go was down - in size, that is. Large
scale integration (LSI) could fit hundreds of components onto one chip. By the
1980's, very large scale integration (VLSI) squeezed hundreds of thousands of
components onto a chip. Ultra-large scale integration (ULSI) increased that number
into the millions. The ability to fit so much onto an area about half the size of a U.S.
dime helped diminish the size and price of computers. It also increased their power,
efficiency and reliability. The Intel 4004 chip, developed in 1971, took the integrated
circuit one step further by locating all the components of a computer (central
processing unit, memory, and input and output controls) on a minuscule chip.
Whereas previously the integrated circuit had had to be manufactured to fit a
special purpose, now one microprocessor could be manufactured and then
programmed to meet any number of demands. Soon everyday household items
such as microwave ovens, television sets and automobiles with electronic fuel
injection incorporated microprocessors.

Such condensed power allowed everyday people to harness a computer's capabilities. Computers were no longer developed exclusively for large business or government
contracts. By the mid-1970's, computer manufacturers sought to bring computers to
general consumers. These microcomputers came complete with user-friendly
software packages that offered even non-technical users an array of applications,
most popularly word processing and spreadsheet programs. Pioneers in this field
were Commodore, Radio Shack and Apple Computers. In the early 1980's, arcade
video games such as Pac Man and home video game systems such as the Atari
2600 ignited consumer interest for more sophisticated, programmable home
computers.

In 1981, IBM introduced its personal computer (PC) for use in the home, office and
schools. The 1980's saw an expansion in computer use in all three arenas as clones
of the IBM PC made the personal computer even more affordable. The number of
personal computers in use more than doubled from 2 million in 1981 to 5.5 million
in 1982. Ten years later, 65 million PCs were being used. Computers continued their
trend toward a smaller size, working their way down from desktop to laptop
computers (which could fit inside a briefcase) to palmtop (able to fit inside a breast
pocket). In direct competition with IBM's PC was Apple's Macintosh line, introduced
in 1984. Notable for its user-friendly design, the Macintosh offered an operating
system that allowed users to move screen icons instead of typing instructions. Users
controlled the screen cursor using a mouse, a device that mimicked the movement
of one's hand on the computer screen.
As smaller computers became more powerful, they could be linked together, or networked, to share memory space, software, and information, and to communicate with each other. As opposed to a mainframe computer, which was one powerful computer that shared time with many terminals for many applications, networked computers allowed individual computers to form electronic co-ops. Using either
direct wiring, called a Local Area Network (LAN), or telephone lines, these networks
could reach enormous proportions. A global web of computer circuitry, the Internet,
for example, links computers worldwide into a single network of information. During
the 1992 U.S. presidential election, vice-presidential candidate Al Gore promised to
make the development of this so-called "information superhighway" an
administrative priority. Though the possibilities envisioned by Gore and others for
such a large network are often years (if not decades) away from realization, the
most popular use today for computer networks such as the Internet is electronic
mail, which allows users to type in a computer address and send messages through
networked terminals across the office or across the world.

Fifth Generation (Present and Beyond)


Defining the fifth generation of computers is somewhat difficult because the field is
in its infancy. The most famous example of a fifth generation computer is the
fictional HAL9000 from Arthur C. Clarke's novel, 2001: A Space Odyssey. With
artificial intelligence, HAL could reason well enough to hold conversations with its
human operators, use visual input, and learn from its own experiences.

Though the wayward HAL9000 may be far from the reach of real-life computer
designers, many of its functions are not. Using recent engineering advances,
computers are able to accept spoken word instructions (voice recognition) and
imitate human reasoning. The ability to translate a foreign language is also
moderately possible with fifth generation computers. This feat seemed a simple
objective at first, but appeared much more difficult when programmers realized that
human understanding relies as much on context and meaning as it does on the
simple translation of words.

Many advances in the science of computer design and technology are coming
together to enable the creation of fifth-generation computers. One such engineering advance is parallel processing, which replaces von Neumann's single central processing unit design with a system harnessing the power of many CPUs working as one. Another is superconductor technology, which allows the flow of
electricity with little or no resistance, greatly improving the speed of information
flow. Computers today have some attributes of fifth generation computers. It will
take several more years of development before expert systems are in widespread
use.
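
A minimal sketch of the parallel-processing idea, using POSIX threads in C (compile with -pthread): a single sum is split across several CPUs working at the same time, instead of being handled by one central processing unit.

    #include <pthread.h>
    #include <stdio.h>

    #define N_THREADS 4
    #define N 1000000

    static long long partial[N_THREADS];

    /* Each thread sums its own slice of the range 1..N. */
    static void *worker(void *arg) {
        long id = (long)arg;
        long start = id * (N / N_THREADS) + 1;
        long end   = (id + 1) * (N / N_THREADS);
        long long sum = 0;
        for (long i = start; i <= end; i++)
            sum += i;
        partial[id] = sum;
        return NULL;
    }

    int main(void) {
        pthread_t threads[N_THREADS];

        /* Fan the work out across several CPUs instead of one. */
        for (long id = 0; id < N_THREADS; id++)
            pthread_create(&threads[id], NULL, worker, (void *)id);

        long long total = 0;
        for (long id = 0; id < N_THREADS; id++) {
            pthread_join(threads[id], NULL);
            total += partial[id];
        }
        printf("sum 1..%d = %lld\n", N, total);  /* 500000500000 */
        return 0;
    }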

Kervin Comendador
BSEcE IV-G

The Development of the Microprocessor


The first Integrated Circuit (IC) was invented in late 1958 by Jack S. Kilby, working for Texas Instruments. The company was an innovative manufacturer of transistors, and Kilby's first job with the company was designing micromodules for the military. This
involved connecting many germanium wafers of discrete components together by
stacking each wafer on top of one another.
Connections were made by running wires up the sides of the wafers. Kilby saw this
process as unnecessarily complicated. He realized that if a piece of germanium was
engineered properly, it could act as many components simultaneously. Thus the first
IC was born. A year later, Fairchild Semiconductor (founded in 1957), a division of Fairchild Camera & Instrument Corporation, invented the modern silicon diffusion process, or planar process, which is still used today. The IC process gradually
evolved over the next ten years including moving the development process over to
computer aided design in 1967.
In June 1968, Robert Noyce (who had helped in developing Fairchild's silicon
process), Gordon Moore and Andrew Grove resigned from Fairchild and founded
their own company. Intel (short for Integrated Electronics) was born. The departure
of the three men was significant not least because Robert Noyce was a co-founder
and vice president of Fairchild.
The reason behind the departure was the skepticism of Fairchild's managers towards the future of the integrated circuit. Fairchild's semiconductor subsidiary resented the managers, as its engineers felt the invention had a great deal to offer.

Integrated CPU
Busicom, a trading name of a now defunct Japanese company called ETI, was
planning a range of next generation programmable calculators. Busicom had
designed 12 chips and asked Intel to produce them. This was as a result of Intel's
expertise with high transistor count chips. Marcian Hoff Jr. was assigned to the project and, after studying the designs, concluded that their complexity far exceeded the needs of a calculator. Hoff was able to contrast the design with the DEC Programmed Data Processor 8 (PDP-8). The PDP-8 had a relatively simple architecture,
yet could perform very high level operations. Hoff realized a general purpose processor would be a simpler design, yet able to handle a greater number of tasks. The MCS-4 chipset, and in particular the 4004 integrated CPU, were thus conceived.
In 1969 Busicom chose Intel's "Microcomputer on a chip" (the word microprocessor
wasn't used until 1972) over its own 12 chip design. Busicom's contract with Intel
stated that Busicom had exclusive rights to buy the new chip set (CPU, ROM, RAM, I/O); however, in 1971 Intel agreed with Busicom that, in exchange for cheaper chip prices, Intel would have full marketing rights, enabling them to sell the chips to whoever wanted to buy them.

Intel CPU Design to the 8086


In late 1969, after the 4004 instruction set had been defined, Computer Terminals
Corporation (CTC) asked Intel to develop an LSI register chip for a new intelligent
terminal they were developing. Due to experience with the 4004 and the furious
pace of development within the industry, Stan Mazor (who had helped on the 4004)
and Hoff agreed that they would put the complete processor on one chip. The 8008,
an 8 bit processor, was defined and work began immediately. The chip was rejected
by CTC as it required many support chips (a minimum of 20 TTL packages for
memory and I/O) and was too slow. Chip design continued in parallel to the MCS-4
and in April 1972 the 45 instruction CPU was launched.
The chip became a great success. Intel looked at the CTC rejection of the 8008 and
realized they had to make a general purpose processor requiring only a handful of
support chips. The Intel 8080 was born in April 1974 even though it was announced
earlier. Intel did this to give customers sufficient lead times to design the chip into
their products. The 8080 had 4,500 transistors, twice the number in the 4004, and could address 64K bytes of memory. Its speed was mainly due to the use of
electron-doped technology as opposed to hole-doped MOS transistors. The chip was
an astounding success and became an industry standard, emulated by other
companies.
In 1978, Intel produced its first 16 bit processor, the 8086. It was source compatible
with the 8080 and 8085 (an 8080 derivative). This chip has probably had more
effect on the present day computer market than any other, although whether this is
justified is debatable; the chip was compatible with the 4 year old 8080 and this
meant it had to use a most unusual overlapping segment register process to access
a full 1 Megabyte of memory.
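
The overlapping segment scheme works by forming a 20-bit physical address from two 16-bit registers: the segment is shifted left four bits and the offset is added to it, giving 1 megabyte of reachable memory and many segment:offset pairs that name the same byte. The short C sketch below shows the arithmetic.

    #include <stdio.h>
    #include <stdint.h>

    /* 8086 real-mode addressing: physical = segment * 16 + offset.      */
    /* Two 16-bit registers combine into a 20-bit address, so the CPU    */
    /* can reach 1 megabyte, and different segment:offset pairs overlap. */
    static uint32_t physical_address(uint16_t segment, uint16_t offset) {
        return ((uint32_t)segment << 4) + offset;
    }

    int main(void) {
        /* Two different segment:offset pairs that name the same byte. */
        printf("1234:0010 -> %05X\n", physical_address(0x1234, 0x0010)); /* 12350 */
        printf("1235:0000 -> %05X\n", physical_address(0x1235, 0x0000)); /* 12350 */

        /* Highest addressable byte: FFFF:000F = FFFFF, i.e. 1 MB - 1. */
        printf("FFFF:000F -> %05X\n", physical_address(0xFFFF, 0x000F));
        return 0;
    }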

The Early Years: Not Just Intel


Although Intel had invented the microprocessor and had grown from a three man
start-up in 1968 to an industrial giant by 1981 with 20,000 employees and revenues
of $188 million, they were not the only company developing microprocessors. By July 1974, 19 microprocessors were either available or announced. By 1975, that number had increased to 40, and by 1976 it was 54. Late 1972 saw the second ever microprocessor, Rockwell's PPS-4, a 4 bit processor. Another 4 bit processor, the
Texas Instruments TMS 1000 was introduced on the market in 1974, although it had
been designed in 1971. This was around the same time as Intel's 4004, but TI failed
to realize the potential, and left the TMS 1000 to spend its first three years
controlling a Texas Instruments calculator. Surprisingly, the TMS 1000 was also the
first microcontroller, as it contained its own RAM and ROM on chip.
By the late 1970s, the cost of the chip had fallen to a few dollars, and it had become the processor of choice for consumer electronics. It was being produced in over
forty variants and sold in the hundreds of millions. The staggering development in
the field was also exemplified in 1974 by the National Semiconductor Processing
And Control Element (PACE). National was a Fairchild offshoot and thus had a large
skills base. Unfortunately, the chip was designed using hole-doped MOS transistors.
This meant the chip ran at only a third of the speed it would have achieved had it been designed using electron doping.

Clones
Due to the success of the Intel 8008, Zilog and Motorola produced competing chips.
Motorola realized the potential of the microprocessor after seeing the 8008. In mid
1974 they launched the 6800, a processor in the same market as the 8080.
Motorola launched the 6800 with a wide variety of support in the way of system
oriented hardware. This integration proved a major factor in the popularity of the
6800, as it did not have Intel compatibility to fall back on. Popular as the chip was, it
fell well short of a derivative designed by a group of engineers who left Motorola to
begin their own company. MOS Technologies delivered their 6800-influenced processor, the 6502, in 1975.
The 6502 was successful due to its simple design, single power supply, and low cost. It
became a favorite for the emerging small home computer businesses including
Apple, Commodore and Acorn. Being a simple yet powerful design, it was able to
hold its own against the later designed and more powerful Z80. As a result, it had
an influence on the concept of Reduced Instruction Set Computing (RISC) and
especially the Acorn ARM processor.
The Zilog chip, the Z80, was significant in that it was compatible with the 8080 yet added 80 more instructions. However, this compatibility was not unexpected, as Zilog was founded by engineers who had left Intel. Two of those engineers were Federico Faggin and Masatoshi Shima, who had designed the 4004 and 8080 for Intel. The Zilog (an acronym in which the "Z" stands for "the last word," the "i" for "integrated," and "log" for "logic") Z80 was a very powerful processor, including on-chip dynamic memory refresh circuits. This enabled system designers such as Sir Clive Sinclair
to produce computers with very little extra circuitry and hence at very little cost.

A year after Intel produced their first 16 bit processor, Motorola introduced another influential and long-lived chip, the 68000. It was able to address a massive 16 megabytes of memory and was able, through intricate internal circuitry, to act like a 32 bit processor internally. The chip found fame in the Macintosh, Amiga and Atari
personal computers.

A New Philosophy - RISC


Most commentators see RISC as a modern concept, more akin to the 1990s, yet it
can be traced to 1965 and Seymour Cray's CDC (Control Data Corporation) 6600.
RISC design emphasizes simplicity of processor instruction set, enabling
sophisticated architectural techniques to be employed to increase the speed of
those instructions. A classic example is the VAX architecture where the INDEX
instruction was 45% to 60% faster when replaced by simpler VAX instructions. The
CDC 6600 had many RISC features, including a small instruction set of only 64 op codes, a load/store architecture and register-to-register operations. Also, instructions were not variable in length but were either 15 or 30 bits long.
Although the term RISC was not used, IBM formalized these principles in the IBM 801 (1975), an Emitter Coupled Logic (ECL) multichip processor. The architecture featured a small instruction set, load/store memory operations only, 24 registers and pipelining.
When RISC became popular in the late eighties, IBM tried to market the design as
the Research OPD (Office Products Division) Mini Processor (ROMP) CPU, but it
wasn't successful. The chip eventually became the heart of the I/O processor for the
IBM 3090. The term RISC first came from one of two University research projects in
the USA. The Berkeley RISC 1 formed the basis for the commercial Scalable
(formerly Sun) Processor Architecture (SPARC) processor, whilst Stanford
University's Microprocessor without Interlocked Pipeline Stages (MIPS) processor
was commercialized and is now owned by Silicon Graphics Inc. The Berkeley RISC I
was begun by David A. Patterson and his colleagues in 1980. He had returned from
a sabbatical at Digital Equipment Corporation in 1979 and had been contemplating
the difficulties surrounding the design of a CPU implementing the VAX architecture.
He submitted a paper to Computer on the subject, but it was rejected on the
grounds of a poor use of silicon. The rejection made Patterson wonder what a good
use of silicon was. This led him "down the RISC path".
The RISC I, II and SPARC families are unusual in that they feature register windows, a concept where a CPU has only a few registers visible to the programmer, but that set can be exchanged for another set, or window, when the programmer chooses.
This was intended to provide a very low subroutine overhead, by facilitating fast
context switches. It was later acknowledged that a clever compiler can produce code for non-windowed machines that is nearly as efficient as code for a windowed processor. Windowing is difficult to implement in hardware, so the concept did not become popular on other architectures. Around the mid-eighties, the term RISC
became somewhat of a buzzword. Intel applied the term to its 80486 processor
although it was clearly nothing of the sort. Steve Przybylski, a designer on the Stanford MIPS project, satirized this in his definition of RISC: 'any computer announced after 1985'.
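
A register window, as described above, can be pictured as a large register file of which only one slice is visible at a time, with a pointer that moves on call and return. The C sketch below uses invented sizes and omits the overlap that real SPARC windows use to pass parameters between caller and callee.

    #include <stdio.h>

    #define WINDOWS      8    /* number of windows in the register file (illustrative) */
    #define REGS_PER_WIN 8    /* registers visible to the program at any moment        */

    /* The whole register file; the program only ever sees one window of it. */
    static int reg_file[WINDOWS][REGS_PER_WIN];
    static int cwp = 0;       /* current window pointer */

    /* On a subroutine call the window pointer moves, exposing fresh registers */
    /* without spilling the caller's registers to memory.                      */
    static void call(void) { cwp = (cwp + 1) % WINDOWS; }
    static void ret(void)  { cwp = (cwp + WINDOWS - 1) % WINDOWS; }

    static void set_reg(int r, int value) { reg_file[cwp][r] = value; }
    static int  get_reg(int r)            { return reg_file[cwp][r]; }

    int main(void) {
        set_reg(0, 42);           /* caller puts a value in its window      */
        call();                   /* subroutine gets a brand-new window     */
        set_reg(0, 7);            /* ... and cannot clobber the caller's r0 */
        ret();                    /* return: the caller's window reappears  */
        printf("caller r0 = %d\n", get_reg(0));   /* prints 42 */
        return 0;
    }
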
Around the time the results of the Stanford and Berkeley projects were released, a
small UK home computer firm, Acorn, was looking for a processor to replace the 6502 used in its existing line of computers. Its review of commercial microprocessors, including the popular 8086 and 68000, concluded that they were not advanced enough, so in 1983 it began its own project to design a RISC microprocessor. The result, the ARM (Advanced RISC Machine, formerly Acorn RISC Machine), is probably the closest to true RISC of any processor available.

Parallelism- The Transputer


In 1979, Inmos was formed by the British government to produce innovative silicon-based products competing on the world stage. The formation was partly in response to the increasing dominance of the market by the USA and the need to provide the UK with manufacturing facilities. During the summer of 1980, Inmos was working on its first microprocessor; however, events were not smooth, with two engineers taking inflexible positions over their ideas of the architecture for this microprocessor.
The engineers were David May, who had been recruited from Warwick University, and Robert Milne, who had come from Scicon, a specialist company producing complex computer programs. Milne felt that the Transputer, the name given to the Inmos chip, should be the first chip in the world specially tailored to run Ada. He felt this was the future of microprocessor design, which was in strict contrast to May and Tony Hoare. Hoare was an academic guru from Warwick, where he had worked with May, and shared a simple approach to the Transputer design. Iann Barron, who had been the driving force behind Inmos, became tired of this wrangling and forced his view on the team. His views happened to encompass those of May, but he also envisaged a multiplicity of individual processors all working concurrently.
The Transputer came to market in 1985 as the T212, a 16 bit initial version with a RISC-like instruction set. Each chip uniquely had 4 serial links, which enabled the microprocessor to be connected to other Transputers in a network. In 1994, the T9000 was launched. It is a design optimized for use in parallel computers using a
systolic array configuration.

The SuperRISCs
In 1988, DEC formed a small team that would develop a new architecture for the
company. Eleven years previously, it had moved from the PDP-11 to the VAX
architecture, but it was seen that it would start lagging behind by the early 1990s.
The project turned out to be huge, with more than 30 engineering groups in 10 countries working on the Alpha AXP architecture, as it came to be known.

The team was given a fabulous design opportunity: an architecture that would take DEC into the 21st century. To accommodate the 15-25 year life span of the architecture, they looked back over the previous 25 years and concluded that a 1,000-fold increase in computing power had occurred. They envisaged the same for the next 25 years, and so concluded that their designs would, in the future, run at 10 times the clock speed, with 10 times the instruction issue rate (10 times superscalar) and 10 processors working together in a system. To enable the processor to run
multiple operating systems efficiently, they took a novel approach and placed
interrupts, exceptions, memory management and error handling into code called
PALcode (Privileged Architecture Library), which has access to the CPU hardware in the way that microcode normally does. This enables the Alpha to be unbiased toward a
particular computing style.
The Alpha architecture was chosen to be RISC but crucially focused on pipelining
and multiple instruction issue rather than traditional RISC features such as condition
codes and byte writes to memory, as these slow down the former techniques. The
chip was released in 1992 and in that year entered the Guinness Book of Records as
the world's fastest single-chip microprocessor. While the Alpha attempts to increase instruction speed by simplifying the architecture and concentrating on clock speed and multiple issue, the PowerPC from IBM and Motorola is the antithesis of this.
The PowerPC was born out of the needs of IBM and Motorola to develop a high
performance workstation CPU. Apple, another member of the PowerPC alliance, needed a CPU to replace the Motorola 680x0 in its line of Macintosh computers. The PowerPC is an evolution of IBM's Performance Optimized With Enhanced RISC
(POWER) multichip processors. Motorola contributed the bus interface from their
88110 microprocessor.
