Introduction
This assignment looks at the history of computer development, which reference books commonly divide into successive generations of computers. Each generation is characterized by a major technological advance that fundamentally changed the way computers perform and operate, resulting in smaller, cheaper, more powerful, and more efficient and reliable devices than their predecessors. The sections below trace this progression through the generations of microprocessor development.
The microprocessor, in turn, made the microcomputer possible. Before its arrival, electronic central processing units (CPUs) were typically built from bulky discrete switching devices, and later from small-scale integrated circuits containing the equivalent of only a few transistors each. By integrating the processor onto one, or a very few, large-scale integrated circuit packages containing the equivalent of thousands or even millions of discrete transistors, the cost of processing power was greatly reduced. Since the first single-chip processors appeared in the early 1970s, the microprocessor has become by far the most common implementation of the CPU.
The evolution of microprocessors has been found to follow what is termed ‘Moore’s Law’. This law suggests that the complexity of an integrated circuit, with respect to minimum component cost, doubles roughly every 24 months, a generalisation that has held true since the early 1970s. From their beginnings as the drivers of calculators, this continuous increase in power led to the dominance of microprocessors over every other form of computer; every system, from the largest mainframes of that era to the smallest handheld computers, now uses a microprocessor at its core.
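As a rough illustration of what this doubling rule implies, the short Python sketch below projects a transistor count forward from the Intel 4004’s widely cited figure of about 2,300 transistors in 1971. The figures are approximate and the calculation is only indicative, not a statement about any particular product.

    # Back-of-the-envelope Moore's Law projection (illustrative figures).
    n_1971 = 2300                          # transistors in the Intel 4004 (1971)
    years = 2000 - 1971                    # projection horizon
    projected = n_1971 * 2 ** (years / 2)  # one doubling every 24 months
    print(f"projected transistors in 2000: {projected:,.0f}")  # ~53 million

The result, roughly 53 million transistors, is broadly in line with the tens of millions of transistors found in commercial desktop processors around the year 2000.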
A microprocessor is a single chip integrating all the functions of a central processing unit (CPU) of a computer. It includes the logical functions, data storage, timing functions and interaction with other peripheral devices. In many cases, the terms ‘CPU’ and ‘microprocessor’ are used interchangeably to denote the same device. Like every genuine engineering marvel, the microprocessor evolved through a series of improvements over the course of the 20th century. A brief history of the device, along with its functioning, is described below.
Working of a Microprocessor
It is the central processing unit (CPU) that coordinates all the functions of a computer. It generates timing signals and sends and receives data to and from every peripheral used inside or outside the computer. The commands required to do this are fed into the device in the form of current variations, which are converted into meaningful instructions by the use of Boolean logic expressions. The processor divides its work into two categories, arithmetic/logic functions and control functions, handled by the arithmetic and logic unit (ALU) and the control unit respectively. Data is communicated through groups of wires called buses. The address bus carries the ‘address’ of the location with which communication is needed, while the data bus carries the data being exchanged.
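To make this cycle concrete, here is a minimal Python sketch of fetch, decode and execute on a hypothetical three-instruction machine; the instruction names and memory layout are invented for illustration and do not belong to any real processor. The program counter plays the part of the address bus, the fetched instruction travels back as if on the data bus, the control unit is the decoding logic, and the ALU does the arithmetic.

    # A toy fetch-decode-execute loop (hypothetical machine, not a real ISA).
    program = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)]
    data = [10, 32, 0]          # data memory
    acc = 0                     # accumulator register inside the CPU
    pc = 0                      # program counter: the value on the address bus

    while True:
        op, addr = program[pc]  # fetch: the instruction returns on the data bus
        pc += 1
        if op == "LOAD":        # the control unit decodes the opcode...
            acc = data[addr]
        elif op == "ADD":
            acc = acc + data[addr]  # ...and the ALU performs the arithmetic
        elif op == "STORE":
            data[addr] = acc
        elif op == "HALT":
            break

    print(data)                 # [10, 32, 42]: the sum has landed in memory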
Microprocessor Types
Microprocessors are commonly categorised into the following types (a brief sketch contrasting the first two follows the list):
CISC (Complex Instruction Set Computers)
RISC (Reduced Instruction Set Computers)
VLIW (Very Long Instruction Word Computers)
Superscalar processors
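To illustrate the difference between the first two categories, the Python sketch below models the same addition twice on a toy machine; every instruction and register name here is invented for illustration, not taken from any real instruction set. CISC-style, the addition is a single memory-to-memory instruction; RISC-style, memory is touched only by explicit loads and stores, and the arithmetic happens between registers.

    # Toy contrast of CISC and RISC styles (hypothetical, not real ISAs).
    memory = {0x10: 7, 0x14: 5}    # toy data memory: address -> value
    regs = [0] * 4                 # toy register file

    # CISC-style: a single instruction reads both operands from memory,
    # adds them, and writes the result straight back to memory.
    def add_mem(dst, src):
        memory[dst] = memory[dst] + memory[src]

    # RISC-style: only loads and stores touch memory; the ALU works
    # purely on registers.
    def load(rd, addr):
        regs[rd] = memory[addr]

    def add(rd, rs, rt):
        regs[rd] = regs[rs] + regs[rt]

    def store(rs, addr):
        memory[addr] = regs[rs]

    add_mem(0x10, 0x14)            # CISC: one instruction, 7 + 5 -> 12

    memory[0x10] = 7               # reset the operand for a fair comparison
    load(0, 0x10)                  # RISC: the same work takes four
    load(1, 0x14)                  # simpler instructions...
    add(2, 0, 1)
    store(2, 0x10)                 # ...ending with 12 back in memory

VLIW and superscalar designs address performance differently again: rather than changing what a single instruction does, they execute several instructions in parallel.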
Microprocessor Evolution
Bardeen and Brattain received the Nobel Prize in Physics in 1956, together with William Shockley, “for their researches on semiconductors and their discovery of the transistor effect.” The invention of the transistor in 1947 was a significant development in the world of technology: a single transistor could perform the function of the bulky vacuum tube used in early computers. It was soon found that the work of many such components could be performed by a group of transistors arranged on a single platform. This platform, known as the integrated circuit (IC), turned out to be a crucial achievement in computing and brought about a revolution in the use of computers. Jack Kilby of Texas Instruments (TI) was honoured with the Nobel Prize for his part in the invention of the IC, which paved the way for the development of microprocessors. Robert Noyce of Fairchild Semiconductor made a parallel development in IC technology and was awarded a patent on his device.
ICs proved that complex functions could be integrated onto a single chip, with greatly improved speed and storage capacity. Both Fairchild and Texas Instruments began mass manufacture of commercial ICs in the early 1960s. Finally, Intel’s Hoff and Faggin were credited with the design of the first microprocessor.
The world’s first microprocessor was the Intel 4004. Next in line was the 8-bit 8008, developed by Intel in 1972 to perform complex functions alongside the 4004.
This started a new era in computer applications. Work that had required huge mainframe computers could now be done on much smaller devices that were comparatively cheap. Earlier, computer use had been limited to large organisations; with the development of the microprocessor, computing trickled down to the common man. The next processor in line was Intel’s 8080, with an 8-bit data bus and a 16-bit address bus. It was among the most popular microprocessors of all time.
At the same time that Intel was manufacturing its processors, the Motorola Corporation developed its own 6800 in competition with Intel’s 8080. Faggin left Intel and formed his own firm, Zilog, which launched a new microprocessor, the Z80, in 1976. It was far superior to the previous two and provided direct competition for the large corporations.
Intel then developed the 8086, which still serves as the base model for the latest advances in the microprocessor family. It was a largely complete processor, integrating all the required features. Motorola’s 68000 was one of the first microprocessors to make extensive use of microcoding in its instruction set. These designs were later extended to 32-bit architectures. Many other players, such as Zilog, IBM and Apple, succeeded in bringing their own products to market, but Intel held a commanding position right through the microprocessor era.
The 1990s saw large-scale application of microprocessors in the personal computer products of Apple, IBM and Microsoft. The decade witnessed a revolution in the use of computers, which by then had become household entities, and the market grew further as microprocessors found use in industry at all levels. Intel brought out its Pentium processor, one of the most popular processors in use to date, which has since been developed into a family of successors pushing into the 21st century.
Microprocessor Architectures
Two dominant computer architectures exist for designing microprocessors and microcontrollers: the Harvard and the von Neumann architecture. Both consist of four major subsystems: memory, input/output (I/O), the arithmetic/logic unit (ALU), and the control unit (diagrammed in Figures 1a and 1b). The ALU and control unit operate together to form the central processing unit (CPU). Within the CPU, instructions and data in flight are held in small high-speed memories called registers. These components interact to carry out the execution of instructions.
[Figures 1a and 1b: block diagrams of the two architectures, showing the central processing unit, input/output, data memory, and program memory (the instruction set).]
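The key structural difference can be sketched in a few lines of Python; the classes below are hypothetical and model no real chip. A von Neumann machine keeps instructions and data in a single memory reached over one set of buses, while a Harvard machine gives each its own memory and buses, so an instruction fetch can overlap a data access.

    # Structural sketch of the two architectures (hypothetical classes).
    class VonNeumannMachine:
        def __init__(self, image):
            self.memory = list(image)       # one memory holds code AND data

        def fetch(self, pc):
            # Instruction fetches and data accesses share the same
            # memory and buses, so they must take turns.
            return self.memory[pc]

    class HarvardMachine:
        def __init__(self, program, data):
            self.program_memory = list(program)  # dedicated instruction memory
            self.data_memory = list(data)        # separate data memory

        def fetch(self, pc):
            # Fetching from program memory can proceed in parallel
            # with an access to data memory.
            return self.program_memory[pc]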