Many people have heard of Moore’s Law – which states that the number of transistors that can be packed into a piece of silicon doubles every two years (or every 18 months, in another formulation). More transistors means greater speed, more and cheaper memory and so on … so in the thirty-two years since I first saw a real “micro-computer”, raw processing power has increased by over 30,000 times on even the conservative formulation of the law.
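The arithmetic behind that figure is a quick check (assuming the conservative two-year doubling period):

```python
# Conservative Moore's Law: one doubling every two years
years = 32
doublings = years // 2   # 16 doublings in 32 years
factor = 2 ** doublings  # cumulative growth factor
print(factor)            # 65536 -- comfortably over 30,000-fold
```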

The contrast with other engineering improvements is enormous: typical lightbulbs have become six or seven times more energy-efficient over the same period – something LED technologies might double again in the next few years. Great, but nothing like computing.

But this contribution from the electronics engineers, while of course impressive, is minuscule in comparison to some of the improvements won by the mathematicians and the computer scientists who have devised better, faster methods of solving common problems. These improvements in algorithms are not universal, unlike Moore’s Law, but they can be much, much bigger – often by a factor of millions.

We are on the verge of having another such improvement revealed – to the “fast Fourier transform” (FFT) algorithm, which allows signals to be broken down into their component frequencies for compression and other purposes.
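To get a feel for why the FFT itself was such a leap, compare operation counts: a direct discrete Fourier transform needs roughly n² multiply-adds, while the FFT needs roughly n·log₂(n). A rough back-of-envelope sketch (n chosen arbitrarily for illustration):

```python
import math

# Rough operation counts for a million-sample transform:
# direct DFT ~ n^2, FFT ~ n * log2(n)
n = 1_000_000
direct_ops = n * n
fft_ops = n * math.log2(n)
speedup = direct_ops / fft_ops
print(f"FFT speed-up at n = {n}: about {speedup:,.0f}x")
```

At a million samples the fast algorithm is already some fifty thousand times cheaper than the direct method – exactly the kind of gain no amount of transistor-packing could match.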

The FFT improvement is said to be around a factor of 10 (the paper has yet to be published; only the abstract is in the public domain) – which could have big implications for any application where fat data has to be pushed over a thin line and some loss of quality is permissible (mobile telephony is one obvious example, but home sensors could be another).
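The new algorithm itself is not public, but the basic idea of FFT-based lossy compression – transmit only the strongest frequency components and discard the rest – can be sketched in a few lines of NumPy (the signal and the number of coefficients kept are invented for illustration):

```python
import numpy as np

# Toy sketch of FFT-based lossy compression: keep only the k strongest
# frequency components and zero out everything else.
t = np.linspace(0, 1, 1024, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

coeffs = np.fft.rfft(signal)            # frequency-domain representation
k = 8                                   # "thin line": send only 8 coefficients
keep = np.argsort(np.abs(coeffs))[-k:]  # indices of the k largest magnitudes
compressed = np.zeros_like(coeffs)
compressed[keep] = coeffs[keep]

restored = np.fft.irfft(compressed)     # lossy reconstruction at the far end
error = np.max(np.abs(restored - signal))
print(f"max reconstruction error: {error:.2e}")
```

Because this toy signal really does consist of only a couple of frequencies, eight coefficients reconstruct it almost perfectly; real signals lose a little quality, which is exactly the trade-off the post describes.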

The FFT image compression algorithm was patented some years ago, but as a pure mathematical procedure such a patent has no application in UK (or EU) law (and surely that is right – how can it be possible to patent something which already exists?) – so it is not clear what the legal position of this innovation will be.

Once again, though, all this shows the fundamental importance of maths and science education to us all – and the importance of pure mathematical and scientific research. A little more emphasis on this and a little less on the “needs of industry” (very few employers are interested in research as opposed to getting their products out of the door a bit faster or cheaper) would be a welcome step forward for policy makers.

###### Related articles

- Faster-than-fast Fourier transform (nextbigfuture.com)
- The Goertzel Algorithm – A faster fast Fourier transform (eetimes.com)
- Algorithm is the Art of The Impossible (easirarafat.wordpress.com)
- New MRI algorithm could reduce the time patients spend in the machine from 45 to 15 minutes (nextbigfuture.com)
- “The first algorithm” (cartesianproduct.wordpress.com)
- Computers Computer basics What is an algorithm /Dan-Marius.ro – my slice of internet / Oradea, Bihor, Romania (supravirtual.wordpress.com)
- Moore’s law squared (johndcook.com)
- 3-d Transistors: Redefining the Transistor (mgitecetech.wordpress.com)
- Faster, smaller and more economical gallium nitride transistors (physorg.com)
