The Rise of Moore

Originally published in the Informanté newspaper on Thursday, 5 November, 2015.


In the history of mankind, no invention has spurred more innovation or changed the lives of so many as the semiconductor. Everyone has heard of them: semiconductors sit at the heart of every electronic device used today, and in the hands of almost every person in the world. And while the name sounds self-explanatory (it semi-conducts?), rarely has a material so ubiquitous been understood by so few of those who use it.

A semiconductor, as its name implies, is perhaps more accurately described as a poor conductor. Most materials are either conductors of electricity (such as copper), which allow current to flow with little resistance, or insulators (such as glass or plastic), which do not allow current to flow at all. Semiconductors, though, lie on the edge between conduction and insulation, and by the judicious application of light, heat or electricity can be varied from one to the other.

These properties were first laid bare by Michael Faraday, whose experiments in electricity generation in the 1820s still underlie most of the electrical power used today. Indeed, Faraday can be called the father of electricity, for it is largely due to his invention of electromagnetic rotary devices that electricity became practical to use.

It is thus no surprise that in 1833 he reported that the resistance of silver sulphide decreased when heated. Unfortunately, he had no explanation for the phenomenon, and over the years several more examples of such materials were found. It was not until the discovery of the electron by J. J. Thomson in 1897, and the development of atomic theory during the first half of the 20th century, that a comprehensive theory of how semiconductors work could be assembled, and better devices built.

It emerged that the electrical properties of semiconductors can be modified by the application of light or electric fields, and that semiconductors could thus be used for energy conversion, signal amplification or switching. This line of work culminated in the invention of the transistor in 1947 by John Bardeen, Walter Brattain and William Shockley at Bell Labs in the United States. For this achievement, they were awarded the Nobel Prize in Physics in 1956.

Transistors soon replaced vacuum tubes as the switching apparatus of choice amongst electrical engineers, but a more compact solution was to come. In 1958, Jack Kilby developed the first integrated circuit: instead of manually assembling hundreds of discrete transistors, a sheet of semiconducting material would be used and the individual transistors ‘etched’ onto it. Kilby still had some problems to solve with his germanium-based design, however, and about six months later Robert Noyce, co-founder of Fairchild Semiconductor, developed a silicon-based integrated circuit that solved most of those that remained.

The development of the integrated circuit allowed for the miniaturization of many devices, and in 1965 Gordon Moore of Fairchild Semiconductor wrote a paper titled “Cramming more components onto integrated circuits,” in which he observed that the number of components in an integrated circuit had been doubling every year, and predicted that the trend would continue (in 1975 he revised the rate to a doubling every two years). Moore turned out to be more prophetic than he realised.
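
To make the doubling arithmetic concrete, here is a minimal sketch in Python (not from the article; it assumes the two-year doubling period of Moore’s 1975 revision, and the Intel 4004 baseline cited in the next paragraph):

```python
# A sketch of Moore's law as compound doubling. Assumes a two-year
# doubling period (Moore's 1975 revision) and the 2 300-transistor
# Intel 4004 of 1971 as the baseline.

def projected_transistors(year, base_year=1971, base_count=2_300,
                          doubling_years=2.0):
    """Project a transistor count by doubling every `doubling_years`."""
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2004, 2015):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

The projection slightly overshoots the real milestones listed below, which shows just how much the exact doubling period matters over four decades of compounding.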

In 1968, Gordon Moore and Robert Noyce left Fairchild Semiconductor to found a little company called Intel Corporation. By 1971, Intel had developed the first commercial microprocessor, the Intel 4004, with 2 300 transistors at a feature size of 10 micrometers. By the 1980s, Intel’s 8088 microprocessor was powering the IBM PC, which made computers a household item; it had 29 000 transistors at 3 micrometers. By 1990, Intel’s 486 processor had 1.2 million transistors at 1 micrometer. In 2004, Intel’s Pentium 4 reached 125 million transistors on a 90-nanometer process. Intel’s latest Core i7 processors pack 2 billion transistors with features a mere 14 nanometers in size.
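
As a rough check, the doubling period these milestones actually imply can be backed out from the endpoints (a sketch; the 2015 date for the 14-nanometer Core i7 is an assumption based on this article’s publication date):

```python
import math

# Back out the doubling period implied by the endpoints above: the
# Intel 4004 (1971, 2 300 transistors) and a 14 nm Core i7
# (about 2 billion transistors; the 2015 date is an assumption).
y0, n0 = 1971, 2_300
y1, n1 = 2015, 2_000_000_000

doublings = math.log2(n1 / n0)   # about 19.7 doublings
period = (y1 - y0) / doublings   # years per doubling
print(f"{doublings:.1f} doublings in {y1 - y0} years: "
      f"one doubling every {period:.1f} years")
```

That works out to one doubling roughly every 2.2 years, remarkably close to Moore’s revised prediction.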

It has been said that if automotive development had kept pace with semiconductor development, you’d be able to buy a V32 engine that does 10 000 km/h, or a 30 kg car that gets 1 000 km/l, all at a price of about N$50. But perhaps an Apple-to-Apple comparison is more apt: at 1991 prices for computing power, an iPhone 5S would have cost about US$2.84 million. Or to put it a different way, the iPhone 5S has the same computing speed as the Cray-1 supercomputer of 1975, the fastest computer in the world at the time. The Cray-1 weighed 5.5 tons and required 115 kilowatts of power to run.
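
The same halving arithmetic applies to that price claim (a sketch; the US$649 launch price and the 2013 launch year of the iPhone 5S are assumptions, not figures from the article):

```python
import math

# How fast must prices halve for US$2.84 million of 1991 computing
# power to cost an iPhone 5S's launch price? (The US$649 price and
# 2013 launch year are assumptions, not figures from the article.)
cost_1991, year_1991 = 2_840_000, 1991
cost_2013, year_2013 = 649, 2013

halvings = math.log2(cost_1991 / cost_2013)   # about 12.1 halvings
period = (year_2013 - year_1991) / halvings   # years per halving
print(f"Price halves every {period:.1f} years")
```

That is a halving of cost roughly every 1.8 years, the same exponential at work in the transistor counts above.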

Today, semiconductors are better understood through quantum physics, which explains the movement of electrons in the crystal lattice. And Moore’s law continues to keep up its pace. Of course, Moore’s law will eventually run into miniaturization limits, but so far it has beaten every obstacle that was once thought insurmountable. Gordon Moore himself has stated that his law has become more beautiful than he had realised, saying "Moore's law is a violation of Murphy's law. Everything gets better and better." And it has: semiconductor technology is the closest thing humanity has experienced to the technological singularity.
