
A microcomputer is a computer with a microprocessor as its central processing unit. Microcomputers are physically small compared to mainframes and minicomputers. Many microcomputers (when equipped with a keyboard and screen for input and output) are also personal computers (in the generic sense).[2][3]

The abbreviation "micro" was common during the 1970s and 1980s,[4] but has now fallen out of
common usage.

The term "microcomputer" came into popular use after the introduction of the minicomputer, although Isaac Asimov had used the word in his short story "The Dying Night" as early as 1956 (published in The Magazine of Fantasy and Science Fiction in July of that year). Most notably, the microcomputer replaced the many separate components that made up the minicomputer's CPU with a single integrated microprocessor chip.

The earliest models, such as the Altair 8800, were often sold as kits to be assembled by the user, and came with as little as 256 bytes of RAM and no input/output devices other than indicator lights and switches. They were nonetheless useful as proofs of concept, demonstrating what such a simple device could do.

However, as microprocessors and semiconductor memory became less expensive, microcomputers in turn grew cheaper and easier to use:

• Increasingly inexpensive logic chips such as the 7400 series allowed cheap dedicated circuitry for improved user interfaces such as keyboard input, instead of simply a row of switches to toggle bits one at a time.

• Use of audio cassettes for inexpensive data storage replaced manual re-entry of a program every time the device was powered on.

• Large, cheap arrays of silicon logic gates in the form of read-only memory (ROM) and EPROMs allowed utility programs and self-booting kernels to be stored within microcomputers. These stored programs could automatically load further, more complex software from external storage devices without user intervention, forming an inexpensive turnkey system that did not require a computer expert to understand or use the device.
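The self-booting scheme described above can be sketched in a few lines. The sector size, function name, and simulated disk below are illustrative assumptions, not details from any particular machine; real bootstrap code lived in ROM and ended by jumping to the loaded image.

```python
# Hypothetical sketch of what a self-booting ROM kernel does: copy a
# program from external storage into RAM, then hand control to it.
# External storage is simulated as a list of fixed-size "sectors".
SECTOR_SIZE = 128  # many CP/M-era floppy formats used 128-byte sectors

def bootstrap(storage: list[bytes], num_sectors: int) -> bytes:
    """Load the first `num_sectors` sectors into a contiguous RAM image."""
    ram = bytearray()
    for sector in storage[:num_sectors]:
        ram += sector
    return bytes(ram)  # on real hardware, execution would jump here

disk = [bytes(SECTOR_SIZE) for _ in range(8)]  # a blank 8-sector "disk"
image = bootstrap(disk, 2)
print(len(image))  # 256
```

The point of the sketch is the absence of user intervention: the ROM routine runs at power-on, and everything more complex is pulled in from storage by the code it loads.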

• Random-access memory became cheap enough to afford dedicating approximately 1-2 kilobytes of memory to a video display controller's frame buffer, enough for a 40×25 or 80×25 text display or blocky color graphics on a common household television. This replaced the slow, complex, and expensive teletypewriter that was previously common as an interface to minicomputers and mainframes.

All these improvements in cost and usability resulted in an explosion in their popularity during
the late 1970s and early 1980s.

A large number of computer makers packaged microcomputers for use in small business
applications. By 1979, many companies such as Cromemco, Processor Technology, IMSAI,
Northstar, Southwest Technical Products Corporation, Ohio Scientific, Altos, Morrow Designs
and others produced systems designed either for a resourceful end user or consulting firm to
deliver business systems such as accounting, database management, and word processing to
small businesses. This allowed businesses unable to afford leasing of a minicomputer or time-
sharing service the opportunity to automate business functions, without (usually) hiring a full-
time staff to operate the computers. A representative system of this era would have used an S-100 bus, an 8-bit processor such as an Intel 8080 or Zilog Z80, and either the CP/M or MP/M operating system.

The increasing availability and power of desktop computers for personal use attracted the
attention of more software developers. In time, and as the industry matured, the market for
personal computers standardized around IBM PC compatibles running DOS, and later Windows.

Modern desktop computers, video game consoles, laptops, tablet PCs, and many types of
handheld devices, including mobile phones, pocket calculators, and industrial embedded
systems, may all be considered examples of microcomputers according to the definition given
above.
