Lots of people have heard of Moore’s law and probably think it means computers get twice as fast every 18 months or two years or so.
Of course, that’s not quite what it really says – which is that we can put twice as many transistors on a chip every two years or so. And for most of the last fifty years that was pretty much equivalent to saying that computers did get twice as fast.
(And, in reality, “Moore’s law” isn’t really a law at all – it began as an observation on the way that commercial pressures were shaping the semiconductor market, with chip and solid-state electronics manufacturers competing by making their offerings more powerful. But it is also a mistake to state, as sometimes happens, that the “law” says we can pack twice as many transistors into a given space every two years – because that is not true either. In the early 70s transistor density was doubling only every three years or so, but chip sizes were also increasing, so transistor numbers doubled every 18 months; by the 1990s the pace of chip-area growth had slowed, and the time taken to double transistor numbers stretched to 24 months.)
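To see how those two doubling rates fit together, here is a quick back-of-the-envelope calculation – a sketch only, using the three-year density doubling and 18-month transistor-count doubling quoted above; the chip-area growth rate is derived from them rather than taken from any source:

```python
import math

def doubling_time(annual_growth):
    """Years for a quantity growing at `annual_growth` (e.g. 0.26 = 26%/yr) to double."""
    return math.log(2) / math.log(1 + annual_growth)

# Early-1970s figures quoted above: density doubled every ~3 years,
# total transistor count every ~18 months (1.5 years).
density_growth = 2 ** (1 / 3) - 1    # annual growth implied by a 3-year doubling
count_growth = 2 ** (1 / 1.5) - 1    # annual growth implied by an 18-month doubling

# Transistor count = density x chip area, so the growth factors compound:
area_growth = (1 + count_growth) / (1 + density_growth) - 1

print(f"implied annual chip-area growth: {area_growth:.1%}")
print(f"implied chip-area doubling time: {doubling_time(area_growth):.1f} years")
```

The arithmetic implies that in that era chip area was itself doubling roughly every three years – density and area contributed about equally to the 18-month count doubling.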
It’s now nearly a decade since what has been called “the breakdown of Moore’s law” and the switch to multicore processors instead of ever-faster single cores. But again, this is wrong. Moore’s law has not broken down at all – transistor numbers are continuing to increase. What has happened is that it is no longer possible to keep running those transistors at ever higher speeds.
Accompanying the decrease in transistor size – the fundamental driver of what is popularly described as Moore’s law – was another, related effect: “Dennard scaling” (named after Robert Dennard, who led the IBM research team that first described it in a 1974 paper). The key consequence of Dennard scaling was that as transistors got smaller, power density stayed constant – so if a transistor’s linear dimensions halved, the power it used fell by a factor of four (with voltage and current both halving).
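That constant-power-density arithmetic can be sketched in a few lines – a minimal illustration of classic Dennard scaling, with the function name and the nominal 1 V / 1 A figures invented for the example, not taken from any real device:

```python
def dennard_scale(voltage, current, length, k):
    """Return (power, area, power_density) after shrinking linear size by a factor k,
    assuming classic Dennard scaling: voltage and current scale down by k too."""
    v, i = voltage / k, current / k      # voltage and current both shrink by k
    area = (length / k) ** 2             # area shrinks by k^2
    power = v * i                        # power = V * I, so it shrinks by k^2
    return power, area, power / area

# Example: halve the linear size (k = 2) of a nominal 1 V, 1 A, unit-length device.
p0, a0, d0 = dennard_scale(1.0, 1.0, 1.0, 1)   # original
p1, a1, d1 = dennard_scale(1.0, 1.0, 1.0, 2)   # linear dimensions halved

print(p0 / p1)   # power falls by a factor of 4
print(d0, d1)    # power density is unchanged
```

Power and area both fall by k², which is exactly why power density – the heat the chip must shed per unit of silicon – held steady for as long as the voltage kept scaling.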
This is what has broken down – not the ability to etch smaller transistors, but the ability to keep dropping the voltage and current they need to operate reliably. (In fact the reduction in the linear scale of transistors ran slightly ahead of the reduction in other components at the start of this century, delivering 3+ GHz chips a little sooner than expected – though that is about as far as we got.)
What has happened is that static power losses have grown ever more rapidly as a proportion of the overall power supplied as voltages have dropped. And static power losses heat the chip, further increasing static power loss and threatening thermal runaway – and complete breakdown.
Reasons for the increased static power loss include complex quantum effects as component sizes fall, and changes to the chemical composition of chips needed to handle those smaller sizes. And there seems to be no way out. “Moore’s law”, as popularly understood in the sense of ever-faster chips, is dead – and even if we don’t follow the science, we understand the consequences all the same, as the recent catastrophic fall in PC sales has demonstrated: nobody is interested in buying a new desktop computer when the one they already have is not obsolete.
Instead we are buying smaller devices (as Moore’s law makes those possible too) and researchers (me included) are looking at the alternatives. My own research is in software for multicore devices rather than in electronics, but electronics researchers have not completely given up hope of using other methods to build faster chips – though, outside the lab, there is nothing to show for it yet (see the links below for some recent developments).
Update: I have edited slightly for sense, and to reflect two things: that when Gordon Moore first spoke about what was later termed his “law” there were no integrated circuits available, and that recent developments by IBM point to at least the potential of solving the problem by developing new materials that might eliminate or minimise static power loss (see the links below).
- Economics rather than engineering marks the beginning of the end of Moore’s law (nextbigfuture.com)
- AMD sees the era of Moore’s law coming to a close (zdnet.com)
- IBM’s New Transistors Only Need A Microburst Of Power (gizmodo.com.au)
- Mark Zuckerberg Might Be Right About Moore’s Law Of Sharing (fastcodesign.com)
- Why Computing Won’t Be Limited By Moore’s Law. Ever (readwrite.com)
- Move Over, Moore’s Law (zocalopublicsquare.org)
- Breaking Moore’s Law: How chipmakers are pushing PCs to blistering new levels (pcworld.com)
- IBM rethinks the transistor to keep scaling compute power (gigaom.com)
- Moore’s law: will it last forever? (technologicallyinsane.wordpress.com)
- IBM Patented a One Atom-Thick Graphene Transistor That Works 1,000 Times Faster Than Silicon (motherboard.vice.com)