An integrated circuit (IC), popularly known as a silicon chip, computer chip, or microchip, is a miniature electronic circuit rendered on a sliver of semiconducting material, typically silicon but sometimes sapphire. A modern integrated circuit can host millions of transistors on a chip as small as 5 millimeters (about 0.2 inches) square and 1 millimeter (0.04 inches) thick. Thanks to these tiny dimensions and remarkable processing power, integrated circuits are found in virtually every modern appliance and device, from credit cards, computers, and mobile phones to satellite navigation systems, traffic lights, and airplanes.
Essentially, an integrated circuit is a composite of various electronic components, namely transistors, resistors, diodes, and capacitors, organized and connected in a way that produces a specific effect. Each component in this 'team' has a unique function within the integrated circuit: the transistor acts as a switch, determining whether the circuit is 'on' or 'off'; the resistor controls the flow of electric current; the diode permits current to flow in one direction only; and the capacitor stores electric charge and releases it when needed.
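To make these roles concrete, the following sketch models each component with its idealized textbook behavior: the transistor as an on/off switch, Ohm's law for the resistor, one-way conduction for the diode, and exponential RC discharge for the capacitor. The function names and values are hypothetical, chosen purely for illustration; real devices behave far less simply.

```python
import math

def transistor(gate_on: bool, current_in: float) -> float:
    """Acts as a switch: passes current only when the gate is on."""
    return current_in if gate_on else 0.0

def resistor(voltage: float, resistance_ohms: float) -> float:
    """Controls current via Ohm's law: I = V / R."""
    return voltage / resistance_ohms

def diode(current: float) -> float:
    """Ideal diode: conducts in one direction only, blocking reverse current."""
    return current if current > 0 else 0.0

def capacitor_voltage(v0: float, r_ohms: float, c_farads: float, t: float) -> float:
    """Voltage of a capacitor discharging through a resistor: V(t) = V0 * e^(-t/RC)."""
    return v0 * math.exp(-t / (r_ohms * c_farads))

print(transistor(True, 0.01))    # 0.01 A passes while the switch is 'on'
print(resistor(5.0, 1000.0))     # 0.005 A through a 1 kOhm resistor at 5 V
print(diode(-0.002))             # 0.0 -- reverse current is blocked
print(capacitor_voltage(5.0, 1000.0, 1e-6, 0.001))  # ~1.84 V left after 1 ms
```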
The first integrated circuit was demonstrated by Texas Instruments engineer Jack Kilby in 1958. This prototype, measuring about 11.1 by 1.6 millimeters, consisted of a strip of germanium and just one transistor. The subsequent adoption of silicon, coupled with the ever-diminishing size of integrated circuits and the rapid increase in the number of transistors per square millimeter, allowed integrated circuits to proliferate massively and gave rise to the age of modern computing.
From its inception in the 1950s to the present day, integrated circuit technology has passed through various 'generations', now commonly referred to as Small Scale Integration (SSI), Medium Scale Integration (MSI), Large Scale Integration (LSI), and Very Large Scale Integration (VLSI). These successive generations trace an arc of progress in IC design that illustrates the prescience of Intel co-founder Gordon Moore, whose 1965 observation, now known as 'Moore's Law', holds that the number of transistors on an integrated circuit doubles approximately every two years.
This doubling in complexity is borne out by the generational progression of the technology, which saw SSI's tens of transistors grow to MSI's hundreds, then to LSI's tens of thousands, and finally to VLSI's millions. The next frontier integrated circuits promise to breach is ULSI, or Ultra-Large Scale Integration, which entails the deployment of billions of microscopic transistors and has already been heralded by the Intel processor codenamed Tukwila, which is understood to employ over two billion transistors.
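As a rough back-of-the-envelope check on this exponential growth, the sketch below projects transistor counts forward from a well-known historical data point, the roughly 2,300 transistors of the 1971 Intel 4004, assuming a doubling every two years. The starting figure and doubling period are the only inputs, and the projection is illustrative rather than a claim about any specific chip.

```python
def transistors(start_count: int, start_year: int, year: int,
                doubling_period: float = 2.0) -> float:
    """Projected transistor count, assuming a doubling every two years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Project from the Intel 4004's ~2,300 transistors in 1971:
for year in (1971, 1980, 1990, 2000, 2008):
    print(year, f"{transistors(2300, 1971, year):,.0f}")
# 1971:       2,300  (the starting point)
# 1980:     ~52,000  (LSI scale: tens of thousands)
# 1990:  ~1,700,000  (VLSI scale: millions)
# 2000: ~53,000,000
# 2008: ~850,000,000
```

Thirty-seven years of doubling every two years thus lands in the hundreds of millions of transistors, the same order of magnitude as Tukwila's two billion.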
If more proof were needed of the persistent accuracy of Moore's dictum, we have only to look at the modern-day integrated circuit, which is faster, smaller, and more ubiquitous than ever. As of 2008, the semiconductor industry was producing more than 267 billion chips a year, a figure expected to rise to 330 billion by 2012.