I dare say what I am about to write is no great revelation and indeed I was probably taught this for ‘A’ level maths, but forgot it…

Anyway, I was reading about the Fast Fourier Transform (FFT), which, discovered in the 1960s, radically improved the speed of signal analysis. Older algorithms had been able to analyse N signal samples in O(N²) time, while the FFT manages it in O(N log N) time.
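To make the comparison concrete, here is a minimal sketch in Python (standard library only, function names my own): a direct O(N²) DFT alongside a radix-2 Cooley–Tukey FFT, checked against each other on a small input. This is an illustration of the general idea, not the particular algorithm the post goes on to analyse.

```python
import cmath

def dft_naive(x):
    """Direct DFT: each of the N outputs sums over all N inputs -> O(N^2)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def fft(x):
    """Radix-2 Cooley-Tukey FFT: split into even/odd halves, recurse,
    then combine the two half-size transforms -> O(N log N).
    Assumes len(x) is a power of two."""
    N = len(x)
    if N == 1:
        return x[:]
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * N
    for k in range(N // 2):
        # One 'butterfly' per k: a twiddle-factor multiply plus an add/subtract pair.
        t = cmath.exp(-2j * cmath.pi * k / N) * odd[k]
        out[k] = even[k] + t
        out[k + N // 2] = even[k] - t
    return out

x = [1.0, 2.0, 0.0, -1.0, 3.0, 1.0, -2.0, 0.5]
# Both routes should give the same spectrum, up to floating-point error.
assert all(abs(a - b) < 1e-9 for a, b in zip(dft_naive(x), fft(x)))
```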

It does this by relying on the fact that one can combine two N-sample transforms to get a 2N-sample transform in N operations, ie.,

2 samples in 1 operation
4 samples in 2 operations
8 samples in 4 operations

Hence we can get N = 2^k samples in k doublings. But how many operations does that take?

For the above example (N = 8) we can see it is 1 + 2 + 4 = 7, and for the more general case we can see this is N(1/2 + 1/4 + 1/8 + …) – N times an infinite series.
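Under this accounting – one combine per doubling, costing n operations to go from n to 2n samples – the total can be tallied directly. A small sketch (the function name is my own):

```python
def doubling_cost(N):
    """Operations needed to build an N-sample transform by repeated doubling,
    assuming combining two n-sample transforms costs n operations.
    Assumes N is a power of two."""
    ops, n = 0, 1
    while n < N:
        ops += n      # one combine at size n -> 2n costs n operations
        n *= 2
    return ops

for N in (2, 4, 8, 1024):
    print(N, doubling_cost(N))
# doubling_cost(8) == 1 + 2 + 4 == 7 == 8 - 1, and in general the total is N - 1
```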

For the overall complexity of the algorithm to be O(N) this series should add to 1. Does it?
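One quick way to convince ourselves is with exact rational arithmetic: the partial sums of 1/2 + 1/4 + 1/8 + … work out to 1 − (1/2)^m, so the gap to 1 halves with every term. A small sketch (again, the function name is my own):

```python
from fractions import Fraction

def partial_sum(m):
    """Exact sum of the first m terms of 1/2 + 1/4 + 1/8 + ..."""
    return sum(Fraction(1, 2**k) for k in range(1, m + 1))

for m in (3, 10, 20):
    # partial_sum(m) == 1 - Fraction(1, 2**m): a geometric series with ratio 1/2
    print(m, partial_sum(m), float(1 - partial_sum(m)))
```

Note that partial_sum(3) = 7/8 matches the N = 8 example above: 8 × 7/8 = 7 operations.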