
ECE301 Term Paper Review

Submitted to: Mr. Nitin Bhomle

Submitted by: Sonali Sood RB6803A22 Sec-B6803 B.tech-MBA(int.)ECE

Introduction:
Memory management is the task of managing a computer's memory. In its simplest form, it involves providing ways to allocate portions of memory to programs on request and to free them for reuse when they are no longer needed. The management of main memory is critical to the computer system. A microprocessor is an integrated-circuit semiconductor chip that performs the bulk of the processing and controls the other parts of a system. A microprocessor functions as the central processing unit of a microcomputer; a disk drive, for example, contains a microprocessor to handle the internal functions of the drive.

Memory Management:
Virtual memory systems separate the memory addresses used by a process from actual physical addresses, isolating processes from one another and increasing the effective amount of RAM by swapping to disk. The quality of the virtual memory manager can have a large impact on overall system performance. Garbage collection is the automatic allocation and de-allocation of memory resources for a program; it is generally implemented at the programming-language level and stands in contrast to manual memory management, in which the program explicitly allocates and de-allocates memory. Region-based memory management is an efficient variant of explicit memory management that can de-allocate large groups of objects simultaneously.
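The region-based scheme mentioned above can be sketched as a minimal arena allocator in C. This is a sketch under stated assumptions: the `Region` type and the `region_*` function names are illustrative, not a standard API.

```c
#include <stdlib.h>
#include <stddef.h>

/* A minimal region (arena) allocator: objects are carved out of one
 * backing block, and the whole group is de-allocated with a single free. */
typedef struct {
    char  *base;  /* start of the region's backing memory */
    size_t used;  /* bytes handed out so far */
    size_t size;  /* total capacity of the region */
} Region;

int region_init(Region *r, size_t size) {
    r->base = malloc(size);
    r->used = 0;
    r->size = size;
    return r->base != NULL;
}

/* Bump-pointer allocation: no per-object bookkeeping is kept. */
void *region_alloc(Region *r, size_t n) {
    if (r->used + n > r->size)
        return NULL;             /* region exhausted */
    void *p = r->base + r->used;
    r->used += n;
    return p;
}

/* De-allocate every object in the region simultaneously. */
void region_free_all(Region *r) {
    free(r->base);
    r->base = NULL;
    r->used = r->size = 0;
}
```

A program might allocate many short-lived objects from one region and release them all at once when a phase of work completes, which is why this style can be more efficient than freeing each object individually.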

Microprocessor:
A microprocessor incorporates most or all of the functions of a computer's central processing unit (CPU) on a single integrated circuit (IC, or microchip). The first microprocessors emerged in the early 1970s and were used for electronic calculators, using binary-coded decimal (BCD) arithmetic on 4-bit words. Other embedded uses of 4-bit and 8-bit microprocessors, such as terminals, printers, and various kinds of automation, followed soon after. Affordable 8-bit microprocessors with 16-bit addressing also led to the first general-purpose microcomputers from the mid-1970s on.

During the 1960s, computer processors were often constructed out of small- and medium-scale ICs containing from tens to a few hundred transistors. The integration of a whole CPU onto a single chip greatly reduced the cost of processing power. From these humble beginnings, continued increases in microprocessor capacity have rendered other forms of computer almost completely obsolete, with one or more microprocessors used in everything from the smallest embedded systems and handheld devices to the largest mainframes and supercomputers. Since the early 1970s, the increase in capacity of microprocessors has followed Moore's Law, which suggests that the number of transistors that can be fitted onto a chip doubles every two years. (Moore originally projected a doubling every year; he later refined the period to two years.) In the late 1990s, in the high-performance microprocessor segment, heat generation (thermal design power, TDP), due to switching losses, static current leakage, and other factors, emerged as a leading design constraint.

History:

The first microprocessor, the Intel 4004, was introduced in 1971. It was not very fast, and its arithmetic capabilities were limited to simple operations such as addition and subtraction.

Size:

A microprocessor houses all of its computational power on a single chip. A chip is usually a thin square piece of silicon containing millions of transistors.

Features:

A microprocessor performs mathematical operations using its arithmetic logic unit (ALU). Modern processors can also perform large arithmetic computations using floating-point units, which allow the microprocessor to carry out sophisticated calculations quickly and accurately.

Significance:

A microprocessor helps to move data from one memory location to another. Microprocessors allow you to transfer information from a USB flash drive to your computer's hard drive in a matter of seconds, depending on the size of the file being moved.

Benefits:

Microprocessors make quick decisions and handle multiple instructions based on those decisions. The instruction register and instruction decoder, using binary encoding and decoding of data, allow the microprocessor to perform user-requested tasks quickly.
