Wednesday, August 12, 2015
Integrated Circuit
"Silicon chip" redirects here. For the electronics magazine, see Silicon Chip.
"Microchip" redirects here. For other uses, see Microchip (disambiguation).
[Image: Erasable programmable read-only memory (EPROM) integrated circuits. These packages have a transparent window that shows the die inside; the window allows the memory to be erased by exposing the chip to ultraviolet light.]
[Image: Integrated circuit from an EPROM memory microchip showing the memory blocks, the supporting circuitry, and the fine silver wires that connect the integrated circuit die to the legs of the packaging.]
[Image: Synthetic detail of an integrated circuit through four layers of planarized copper interconnect, down to the polysilicon (pink), wells (greyish), and substrate (green).]
An integrated circuit or monolithic integrated circuit (also referred to as an IC, a chip, or a microchip) is a set of electronic circuits on one small plate ("chip") of semiconductor material, normally silicon. This can be made much smaller than a discrete circuit made from independent electronic components. ICs can be made very compact, having up to several billion transistors and other electronic components in an area the size of a fingernail. The width of each conducting line in a circuit can be made smaller and smaller as the technology advances; in 2008 it dropped below 100 nanometers, and has now been reduced to tens of nanometers.
ICs were made possible by experimental discoveries showing that semiconductor devices could perform the functions of vacuum tubes and by mid-20th-century technology advancements in semiconductor device fabrication. The integration of large numbers of tiny transistors into a small chip was an enormous improvement over the manual assembly of circuits using discrete electronic components. The integrated circuit's mass production capability, reliability and building-block approach to circuit design ensured the rapid adoption of standardized integrated circuits in place of designs using discrete transistors.
ICs have two main advantages over discrete circuits: cost and performance. Cost is low because the chips, with all their components, are printed as a unit by photolithography rather than being constructed one transistor at a time. Furthermore, packaged ICs use much less material than discrete circuits. Performance is high because the IC's components switch quickly and consume little power (compared to their discrete counterparts) as a result of the small size and close proximity of the components. As of 2012, typical chip areas range from a few square millimeters to around 450 mm², with up to 9 million transistors per mm².
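As a rough check on those figures, multiplying die area by transistor density bounds the transistor count on a chip. The short Python sketch below uses the 2012-era numbers quoted above; the two example die areas are assumptions chosen for illustration, not real parts.

    # Rough transistor-count estimate: die area (mm^2) times density
    # (transistors per mm^2). The 9 million/mm^2 figure is the 2012-era
    # density quoted in the text; the example die areas are assumed.

    DENSITY_PER_MM2 = 9_000_000  # upper-end 2012 density from the text

    for name, area_mm2 in [("small die", 5), ("large die", 450)]:
        transistors = area_mm2 * DENSITY_PER_MM2
        print(f"{name}: {area_mm2} mm^2 -> ~{transistors:,} transistors")

At the quoted density, a 450 mm² die works out to roughly four billion transistors, consistent with the "several billion" figure mentioned earlier.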
Integrated circuits are used in virtually all electronic equipment today and have revolutionized the world of electronics. Computers, mobile phones, and other digital home appliances are now inextricable parts of the structure of modern societies, made possible by the low cost of integrated circuits.
Terminology
An integrated circuit is defined as
A circuit in which all or some of the circuit elements are inseparably associated and electrically interconnected so that it is considered to be indivisible for the purposes of construction and commerce.
Circuits meeting this definition can be constructed using many different technologies, including thin-film transistors, thick-film technology, or hybrid integrated circuits. However, in general usage integrated circuit has come to refer to the single-piece circuit construction originally known as a monolithic integrated circuit.

Invention
Main article: Invention of the integrated circuit
Early developments of the integrated circuit go back to 1949, when German engineer Werner Jacobi (Siemens AG) filed a patent for an integrated-circuit-like semiconductor amplifying device showing five transistors on a common substrate in a 3-stage amplifier arrangement. Jacobi disclosed small and cheap hearing aids as typical industrial applications of his patent. No immediate commercial use of his patent has been reported.
The idea of the integrated circuit was conceived by Geoffrey W.A. Dummer (1909–2002), a radar scientist working for the Royal Radar Establishment of the British Ministry of Defence. Dummer presented the idea to the public at the Symposium on Progress in Quality Electronic Components in Washington, D.C. on 7 May 1952.[8] He publicized the idea at many subsequent symposia, and unsuccessfully attempted to build such a circuit in 1956.
A precursor idea to the IC was to create small ceramic squares (wafers), each containing a single miniaturized component. Components could then be integrated and wired into a compact two-dimensional or three-dimensional grid. This idea, which seemed very promising in 1957, was proposed to the US Army by Jack Kilby and led to the short-lived Micromodule Program (similar to 1951's Project Tinkertoy).[9] However, as the project was gaining momentum, Kilby came up with a new, revolutionary design: the IC.
[Image: Jack Kilby's original integrated circuit]
Newly employed by Texas Instruments, Kilby recorded his initial ideas concerning the integrated circuit in July 1958, successfully demonstrating the first working integrated example on 12 September 1958.[10] In his patent application of 6 February 1959,[11] Kilby described his new device as “a body of semiconductor material … wherein all the components of the electronic circuit are completely integrated.” The first customer for the new invention was the US Air Force.
Kilby won the 2000 Nobel Prize in Physics for his part in the invention of the integrated circuit.[14] His work was named an IEEE Milestone in 2009.
Half a year after Kilby, Robert Noyce at Fairchild Semiconductor developed his own idea of an integrated circuit that solved many practical problems Kilby's had not. Noyce's design was made of silicon, whereas Kilby's chip was made of germanium. Noyce credited Kurt Lehovec of Sprague Electric for the principle of p–n junction isolation caused by the action of a biased p–n junction (the diode) as a key concept behind the IC.
Fairchild Semiconductor was also home to the first silicon-gate IC technology with self-aligned gates, the basis of all modern CMOS computer chips. The technology was developed by Italian physicist Federico Faggin in 1968. He later joined Intel to develop the first single-chip central processing unit (CPU), the Intel 4004, for which he received the National Medal of Technology and Innovation in 2010.
Generations
In the early days of simple integrated circuits, the technology's large scale limited each chip to only a few transistors, and the low degree of integration meant the design process was relatively simple. Manufacturing yields were also quite low by today's standards. As the technology progressed, millions, then billions of transistors could be placed on one chip, and good designs required thorough planning, giving rise to new design methods.
Name   Meaning                         Year   Number of transistors[18]   Number of logic gates[19]
SSI    small-scale integration         1964   1 to 10                     1 to 12
MSI    medium-scale integration        1968   10 to 500                   13 to 99
LSI    large-scale integration         1971   500 to 20,000               100 to 9,999
VLSI   very-large-scale integration    1980   20,000 to 1,000,000         10,000 to 99,999
ULSI   ultra-large-scale integration   1984   1,000,000 and more          100,000 and more
SSI, MSI and LSI
The first integrated circuits contained only a few transistors. Early digital circuits containing tens of transistors provided a few logic gates, and early linear ICs such as the Plessey SL201 or the Philips TAA320 had as few as two transistors. The number of transistors in an integrated circuit has increased dramatically since then. The term "large-scale integration" (LSI) was first used by IBM scientist Rolf Landauer when describing the theoretical concept; that term gave rise to the terms "small-scale integration" (SSI), "medium-scale integration" (MSI), "very-large-scale integration" (VLSI), and "ultra-large-scale integration" (ULSI). The early integrated circuits were SSI.
SSI circuits were crucial to early aerospace projects, and aerospace projects helped inspire development of the technology. Both the Minuteman missile and Apollo program needed lightweight digital computers for their inertial guidance systems; the Apollo guidance computer led and motivated the integrated-circuit technology,[20] while the Minuteman missile forced it into mass-production. The Minuteman missile program and various other Navy programs accounted for the total $4 million integrated circuit market in 1962, and by 1968, U.S. Government space and defense spending still accounted for 37% of the $312 million total production. The demand by the U.S. Government supported the nascent integrated circuit market until costs fell enough to allow firms to penetrate the industrial and eventually the consumer markets. The average price per integrated circuit dropped from $50.00 in 1962 to $2.33 in 1968.[21] Integrated circuits began to appear in consumer products by the turn of the decade, a typical application being FM inter-carrier sound processing in television receivers.
The first MOS chips were small-scale integrated chips for NASA satellites.
The next step in the development of integrated circuits, taken in the late 1960s, introduced devices which contained hundreds of transistors on each chip, called "medium-scale integration" (MSI).
In 1964, Frank Wanlass demonstrated a 16-bit shift register he had designed, with a then-remarkable 120 transistors on a single chip.
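To make the shift-register idea concrete, here is a minimal behavioral sketch in Python of a 16-bit serial-in, serial-out shift register. It models only the logical behavior of shifting one bit per clock; the class and method names are invented for illustration and do not describe how Wanlass's actual 120-transistor circuit was built.

    # Behavioral model of a 16-bit serial-in, serial-out shift register.
    # Each clock "tick" moves every stored bit one stage toward the output
    # and accepts a new input bit, as the hardware registers described above do.

    class ShiftRegister16:
        WIDTH = 16

        def __init__(self):
            self.bits = [0] * self.WIDTH  # bits[0] is the input stage

        def tick(self, bit_in):
            """Clock once: shift toward the output, return the bit shifted out."""
            bit_out = self.bits[-1]
            self.bits = [bit_in & 1] + self.bits[:-1]
            return bit_out

    reg = ShiftRegister16()
    out = [reg.tick(b) for b in [1, 0, 1, 1] + [0] * 16]
    print(out)  # the pattern 1, 0, 1, 1 reappears 16 ticks after it went in

In hardware, each of the 16 stages is a flip-flop built from several transistors, which is how even a simple register like this could use on the order of a hundred transistors.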
MSI devices were attractive economically because, while they cost little more to produce than SSI devices, they allowed more complex systems to be produced using smaller circuit boards, less assembly work (because of fewer separate components), and a number of other advantages.
Further development, driven by the same economic factors, led to "large-scale integration" (LSI) in the mid-1970s, with tens of thousands of transistors per chip.
SSI and MSI devices were often manufactured using masks created by hand-cutting Rubylith film; an engineer would inspect and verify the completeness of each mask. LSI devices contain so many transistors, interconnecting wires, and other features that checking the masks, or even doing the original design, entirely by hand is considered impossible; the engineer depends on computer programs and other hardware aids to do most of this work.
Integrated circuits such as 1K-bit RAMs, calculator chips, and the first microprocessors, which began to be manufactured in moderate quantities in the early 1970s, had under 4,000 transistors. True LSI circuits, approaching 10,000 transistors, began to be produced around 1974, for computer main memories and second-generation microprocessors.
VLSI
Main article: Very-large-scale integration
[Image: Upper interconnect layers on an Intel 80486DX2 microprocessor die]
The final step in the development process, starting in the 1980s and continuing through the present, was "very-large-scale integration" (VLSI). The development started with hundreds of thousands of transistors in the early 1980s, and continues beyond several billion transistors as of 2009.
Multiple developments were required to achieve this increased density. Manufacturers moved to smaller design rules and cleaner fabrication facilities, so that they could make chips with more transistors and maintain adequate yield. The path of process improvements was summarized by the International Technology Roadmap for Semiconductors (ITRS). Design tools improved enough to make it practical to finish these designs in a reasonable time. The more energy-efficient CMOS replaced NMOS and PMOS, avoiding a prohibitive increase in power consumption.
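One way to see why the switch to CMOS avoided a prohibitive increase in power consumption: a static CMOS gate draws significant current mainly while it switches, so its dynamic power is commonly estimated with the standard first-order formula P ≈ α·C·V²·f (activity factor × switched capacitance × supply voltage squared × clock frequency). The Python sketch below evaluates this with assumed example values; the formula is textbook CMOS, but none of the numbers come from this article.

    # First-order CMOS dynamic power estimate: P ~= alpha * C * V^2 * f.
    # Every number below is an illustrative assumption, not a measured value.

    alpha = 0.1   # activity factor: fraction of nodes switching per cycle
    C = 1e-9      # total switched capacitance in farads (1 nF, assumed)
    V = 5.0       # supply voltage in volts (typical of early CMOS logic)
    f = 10e6      # clock frequency in hertz (10 MHz, assumed)

    power_watts = alpha * C * V**2 * f
    print(f"Estimated dynamic power: {power_watts:.3f} W")  # 0.025 W here

Because the V² term dominates, lowering the supply voltage as feature sizes shrank became the main lever for keeping chips with ever more transistors within a workable power budget.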
In 1986 the first one-megabit RAM chips were introduced, containing more than one million transistors. Microprocessor chips passed the million-transistor mark in 1989 and the billion-transistor mark in 2005.[25] The trend continues largely unabated, with chips introduced in 2007 containing tens of billions of memory transistors.[26]
ULSI, WSI, SOC and 3D-IC
To reflect further growth in complexity, the term ULSI, standing for "ultra-large-scale integration", was proposed for chips of more than 1 million transistors.
Wafer-scale integration (WSI) is a means of building very large integrated circuits that uses an entire silicon wafer to produce a single "super-chip". Through a combination of large size and reduced packaging, WSI could lead to dramatically reduced costs for some systems, notably massively parallel supercomputers. The name is taken from the term very-large-scale integration, the state of the art at the time WSI was being developed.[28]
A system-on-a-chip (SoC or SOC) is an integrated circuit in which all the components needed for a computer or other system are included on a single chip. The design of such a device can be complex and costly, and building disparate components on a single piece of silicon may compromise the efficiency of some elements. However, these drawbacks are offset by lower manufacturing and assembly costs and by a greatly reduced power budget: because signals among the components are kept on-die, much less power is required (see Packaging).
A three-dimensional integrated circuit (3D-IC) has two or more layers of active electronic components that are integrated both vertically and horizontally into a single circuit. Communication between layers uses on-die signaling, so power consumption is much lower than in equivalent separate circuits. Judicious use of short vertical wires can substantially reduce overall wire length for faster operation.