# Daily Telegraph’s “Make Britain Count” campaign

The Daily Telegraph has today launched a “Make Britain Count” campaign.

It’s interesting for several reasons. First of all, despite the paper’s politics it avoids (so far, at least) the likely line that would be taken by the Mail or Express of telling us that we are about to be over-run by hordes of innumerate ruffians from the lower orders, and instead focuses on the relatively small numbers (about 13% in England) taking advanced courses (‘A’ level in England) after 16.

The politics does creep in though – they have repeatedly called it their “numeracy campaign” and yet write:

This deep-rooted problem has not escaped the attention of successive governments. In 1999, David Blunkett introduced the National Numeracy Strategy for all primary schools (updated in 2006). Many questioned the use of the word “numeracy” as a New Labour attempt to rebrand good old-fashioned “mathematics”.

Seems they cannot help themselves, even if it reflects badly on them.

Perhaps the campaign is motivated by the knowledge that many middle-class parents spend large sums on additional tuition for their children in maths – tuition that is often counterproductive because it focuses on rote learning of how to “do sums” rather than an understanding of mathematics.

But it is good to see the repeated public boasting in Britain about mathematical ignorance challenged.

# A multiple choice quiz on the Newton-Raphson method

Maybe I am not very good, but it took me two goes to get up to 5/6 correct answers on this multiple choice quiz on the Newton-Raphson method. The one I still have not got right is the first one.
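
For anyone rusty on the method itself, here is a minimal sketch in Python – the iteration is just $x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$; the example function, starting point and tolerance are arbitrary choices of mine and nothing to do with the quiz:

```python
def newton_raphson(f, f_prime, x0, tol=1e-10, max_iter=50):
    """Return an approximate root of f, starting the iteration at x0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / f_prime(x)   # x_{n+1} = x_n - f(x_n)/f'(x_n)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton-Raphson did not converge")

# Example: the square root of 2 as the positive root of f(x) = x^2 - 2
print(newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, 1.0))  # 1.41421356...
```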

# A better calculator

Only a few days ago I was musing to myself that, back in the days when BASIC was all I had, I could easily see what a function plot looked like by writing a dozen or so lines of code, while today I have to load libraries and make specialised calls to their APIs.
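
To be fair, in Python with numpy and matplotlib the modern version is still fairly short – something like this (the function plotted is just an arbitrary example of mine):

```python
import numpy as np
import matplotlib.pyplot as plt

# Plot y = sin(x) over one full period
x = np.linspace(0, 2 * np.pi, 500)
plt.plot(x, np.sin(x))
plt.title("y = sin(x)")
plt.show()
```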

Then along comes “A Better Calculator” and it is even easier!

# Better algorithms, not better hardware, are what make your computer faster, faster

Many people have heard of Moore’s Law – which states that the number of transistors that can be packed into a piece of silicon doubles every two years (or 18 months in another formulation). More transistors means greater speed, more and cheaper memory and so on … so in the thirty-two years since I first saw a real “micro-computer”, raw processing power has increased by over 30,000 times – $2^{16}$, or 65,536-fold – even on the conservative two-year formulation of the law.

The contrast with other engineering improvements is enormous: typical lightbulbs have become six or seven times more energy-efficient in that period – a figure LED technologies might double again in the next few years. Great, but nothing like computing.

But this contribution from the electronics engineers, while of course impressive, is minuscule in comparison to some of the improvements won by the mathematicians and the computer scientists who have perfected better, faster methods of solving common problems. These improvements in algorithms are not universal, unlike Moore’s Law, but they can be much, much bigger – often a factor of millions of times faster.

We are on the verge of having another such improvement revealed – to the “fast Fourier transform” (FFT) algorithm, which allows signals to be broken down into their components for compression and other purposes.
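
To make concrete what the transform is used for, here is a toy sketch in Python/numpy – the signal and the threshold are arbitrary choices of mine and have nothing to do with the new algorithm itself:

```python
import numpy as np

# A simple signal made of two sine waves, sampled 1024 times over one second
t = np.linspace(0, 1, 1024, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

coeffs = np.fft.rfft(signal)                    # break the signal into frequency components
keep = np.abs(coeffs) > 0.1 * np.abs(coeffs).max()
approx = np.fft.irfft(coeffs * keep)            # rebuild it from only the large components

print("kept", int(keep.sum()), "of", coeffs.size, "coefficients")
print("max reconstruction error:", np.abs(signal - approx).max())
```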

The FFT improvement is said to be (the paper is yet to be published and only the abstract is in the public domain) around a factor of 10 – which could have big implications for any application where fat data has to be pushed over a thin line and some loss of quality is permissible (mobile telephony is one obvious example but home sensors could be another).

The FFT image compression algorithm was patented some years ago, but as a pure mathematical procedure such a patent has no application in UK (or EU) law (and surely that is right – how can it be possible to patent something which already exists?) – it is not clear what the legal position of this innovation will be.

Once again, though, all this shows the fundamental importance of maths and science education to us all – and the importance of pure mathematical and scientific research. A little more emphasis on this and a little less on the “needs of industry” (very few employers are interested in research as opposed to getting their products out of the door a bit faster or cheaper) would be a welcome step forward for policy makers.

# An alternative second half?

Going back to the previous blog and looking for an alternative second half…

We have $\frac{d}{dx}e^x = \lim_{\delta \to 0}\lim_{n \to 0} (1 + n)^{\frac{x}{n}}\frac{(1+n)^{\frac{\delta}{n}}-1}{\delta}$

For this to equal $e^x$ (ie for $\frac{d}{dx}e^x = e^x$) we need $\frac{(1+n)^{\frac{\delta}{n}}-1}{\delta}=1$ in the limit.

So $\delta = (1+n)^{\frac{\delta}{n}}-1$

$\delta + 1 = (1+n)^{\frac{\delta}{n}}$

Raise both sides to the $n$th power:

$(\delta + 1)^n = (1+n)^\delta$

At $\lim_{\delta \to 0}$ and $\lim_{n \to 0}$ it can be seen that both sides equal 1.
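
A quick numerical check in Python does at least suggest both sides behave as claimed (the particular small values of $\delta$ and $n$ are just ones I picked):

```python
# Check (1+delta)^n and (1+n)^delta, and the factor ((1+n)^(delta/n) - 1)/delta,
# as delta and n shrink towards zero
for delta, n in [(1e-2, 1e-3), (1e-4, 1e-5), (1e-6, 1e-7)]:
    lhs = (1 + delta) ** n
    rhs = (1 + n) ** delta
    factor = ((1 + n) ** (delta / n) - 1) / delta
    print(f"delta={delta:g}, n={n:g}: lhs={lhs:.8f}, rhs={rhs:.8f}, factor={factor:.6f}")
```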

Feels like quite a weak proof to me. But I am not in a position to claim expertise. Thoughts, anyone?

# Differential calculus reminder

This is just an online note to myself about differential calculus. A level maths again…

Calculating $\frac{d}{dx}2^x$

$2^x = e^u$ where $u=\ln(2^x)$

Using the chain rule: $\frac{dy}{dx} = \frac{dy}{du}\frac{du}{dx}$

$\frac{dy}{du} = e^u$ (as $\frac{d}{dx}e^x = e^x$)

$\ln(2^x) = x\ln(2)$ hence $\frac{d}{dx}\ln(2^x) = \ln(2)$.

So $\frac{d}{dx}2^x = e^{\ln(2^x)}\ln(2) = 2^x\ln(2)$
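
A quick numerical check of the result in Python (the point $x$ and the step size $h$ are arbitrary choices of mine):

```python
import math

# Compare a central-difference estimate of d/dx 2^x with 2^x * ln(2)
x, h = 1.5, 1e-6
numeric = (2 ** (x + h) - 2 ** (x - h)) / (2 * h)
exact = 2 ** x * math.log(2)
print(numeric, exact)   # both approximately 1.9605
```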

And $e$, Euler’s number, really is magical.

# Second-hand maths and computing books

I am nominally on holiday at the moment – though at times it doesn’t feel like it with the millstone of the MSc project report round my neck (one month till deadline today). But the weather is good – I am typing this outside in the Sun – and the setting is lovely (Hay-on-Wye in the Welsh borders).

Hay is famous for its second-hand bookshops and there are certainly plenty of them in the town. I was here before, in 1997, and more than ever the place seems to exude comfort and prosperity: a huge contrast to Wigtown in Scotland, where I visited last summer and which aims to be Scotland’s book town in the same way that Hay is for Wales (as Hay sits right on the border – it’s about 400 metres from where I am writing this – many people seem to think it is in England too).

Wigtown feels like a place that has opted to be a “book town” because it has tried everything else. Hay may have been like that once, but with its good pubs and restaurants and genteel charm it has moved beyond that now.

But one thing they have in common is a general dearth of maths and science books in their shops. To be sure, Hay has more shops and more books on sale than Wigtown, but the science proportion is just as low (ie close to zero).

There are some bargains to be found – I bought a good quality hard-back copy of The Honors Class: Hilbert’s Problems and Their Solvers for much less than it costs in paperback: assuming my interest in computability survives completing the degree, that will fill some autumn nights.

But there is not much. Having set up the telescope I had hoped I would be able to find a second-hand copy of Norton’s Star Atlas: but if it’s out there it is well hidden.

Similarly I cannot really find any decent books on statistics – I want something between the simple “Maths for Dummies” type and the hugely advanced “Latest developments in analysis” that can be found here and there.

For computing the story is even worse. If you have some ancient Windows 3.1 package or are writing stuff in Delphi then maybe you might find something here, but it seems that computer users and programmers do not sell-on books that refer to anything useful.

To be fair: I should add that Wigtown is not a bad place to spend an afternoon – and in general Galloway is a good place to go on holiday.

# Reflections on the riots: part one

This is a blog about computing (along with some maths and science) – and not about politics, and having disciplined myself to stick to that for the last nine months, I intend to keep it that way, even as I write about the biggest political event of the year.

But I will allow myself two short political observations: firstly, that disrespect for the law and contempt for order are not new things in London. If you read Albion’s Fatal Tree you will see that there have long been many in the capital who have made at least part of their livelihoods from criminality and who celebrated their fellows. Pretending that recent events represent some terrible breakdown in ancient respect for authority is ahistorical.

And, before people start to say it is the fault of rap music or other “alien” influences, do they remember this? Perhaps the Fast Show is the real cause of the disorder?

So, that over, what is the science point? Well, it was consistently reported during last week’s disturbances that the looters were sharing their intelligence through BlackBerry smartphones, specifically through “BlackBerry Messenger” (BBM). Given that the UK has one of the most sophisticated signals intelligence set-ups in the world at GCHQ, the fact that the police were clearly left in the halfpenny seats by the looters suggests to me that nobody there has yet proved that P=NP or developed an algorithm to crack the one-way functions used to encrypt the BBMs.

According to Wikipedia, BlackBerry encrypts everything with the “Advanced Encryption Standard” (AES). A brute-force attack on this would, on average, require $2^{255}$ attempts (for the 256-bit encryption), so that is not a practical option (eg the universe is very roughly $4 \times 10^{17}$ seconds old).
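
A back-of-the-envelope calculation makes the point – the trials-per-second figure below is an arbitrary and very generous assumption of mine, not a real benchmark:

```python
# How many ages of the universe would an average brute-force search of a
# 256-bit AES key space take, even at a (fanciful) 10^18 trials per second?
average_attempts = 2 ** 255
trials_per_second = 10 ** 18           # assumed, very generous
age_of_universe_seconds = 4 * 10 ** 17
print(average_attempts / trials_per_second / age_of_universe_seconds)
# roughly 1.4e41 ages of the universe
```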

Now, it could be that the US government has cracked this thing and just refuses to tell even its closest ally (I dare say the name Kim Philby is still spat out in various places), but my guess is that AES is safe, for now.

As I have said before, that is probably a pity: while a world where P=NP would be one where internet commerce was broken, it would also be one with many compensatory benefits.

# Gödel’s Incompleteness theorem surpassed?

Gödel’s Incompleteness Theorems are one of the cornerstones of modern mathematical thought, but they are also a major blot on the mathematical landscape – they establish an inherent limit on the ability of mathematicians to describe the mathematical world: the first theorem (often thought of as the theorem) states that no consistent (ie free of internal contradiction) axiomatic system is capable of describing all the facts about the natural numbers.

To today’s physical scientists – used to concepts such as relativity and quantum uncertainty – the broad idea that there could be an uncertainty at the heart of mathematics is maybe not so difficult to take, but it is fair to say it broke a lot of mathematical hearts in the 1930s when first promulgated. (This book – Godel’s Proof – offers an excellent introduction for the non-mathematician who is mathematically competent – ie like me!).

Gödel thought at the time that this kink in mathematical reality could be smoothed out by a better understanding of infinities in mathematics – and, according to the cover article by Richard Elwes in this week’s New Scientist (seemingly only available online to subscribers), Hugh Woodin of UC Berkeley now claims to have shown just that.

Along the way, this new hypothesis of “Ultimate L” is also said to demonstrate that Cantor’s continuum hypothesis is correct. I do not claim to understand “Ultimate L”, and in any case, as is their style, the New Scientist do not print the proof, they just describe it in layman’s terms. I do have a basic understanding of the continuum hypothesis, though, and so can sketch the essential points of what “Ultimate L” is claimed to show.

Georg Cantor showed that there are multiple infinities, the first of which, so-called $\aleph_0$ (aleph null), is the infinity of the countable numbers – eg 1, 2, 3 … and so on. Any infinite set that can be paired off with the counting numbers in this way has a cardinality of $\aleph_0$. (And, as the New Scientist point out, this is the smallest infinity – eg if you thought that, say, there must be half as many even numbers as there are natural numbers, you are wrong – both sets have cardinality $\aleph_0$: 2 is element 1 of the set, 4 is element 2 and so on, ie a natural number can be assigned to every member of the set of even numbers, so that set too has cardinality $\aleph_0$.)

The continuum hypothesis concerns what might be $\aleph_1$ – the next biggest infinity. Cantor’s hypothesis is that $\aleph_1$ is the cardinality of the real numbers (the continuum): I discuss why this is infinite and a different infinity from the natural numbers here.

We can show that this set has cardinality $2^{\aleph_0}$ – a number very much bigger than $\aleph_0$. But is there another infinity in between?
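
(A rough way to see where the $2^{\aleph_0}$ comes from: every real in $[0,1)$ can be written as an infinite binary expansion, ie as a choice of 0 or 1 for each of $\aleph_0$ digit positions, giving $2^{\aleph_0}$ possibilities – and the handful of double-counted expansions such as $0.0111\ldots = 0.1000\ldots$ make no difference to the cardinality.)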

Mathematicians have concentrated on looking at whether any projections (a word familiar to me now from relational algebra) of the set of reals have a cardinality between $\aleph_0$ and $2^{\aleph_0}$ – if any did, then it would be clear the reals could not have the cardinality $\aleph_1$ but some other, higher, $\aleph$.

No projections with a different cardinality have been found, but that is not the same as a proof they do not exist. But if Woodin’s theory is correct then none exist.

(Just one more chance to plug the brilliant Annotated Turing: if you are interested in computer science you should really read it! This is the book that first got me interested in all this.)

# Computer science: the worst degree?

Last Friday the UK’s Higher Education Statistics Agency (HESA) published data on the employment rates of graduates (of bachelor’s degrees) in the UK and one thing is very clear: computer scientists are less likely to be in jobs than graduates in any other broad discipline.

Just 84.7% of recent graduates from full time computer science degrees were in employment in 2009/10, compared to, say, 86% of graduates from the much-maligned “mass communication and documentation” field – that’s “media studies” to the Daily Mail et al.

In contrast, 89.6% of graduates (full-timers) in the mathematical sciences – the closest analogue to computer science – were in employment.

So, perhaps this is a function of the recession and the decline in financial services employment? Well, it seems not. The low employment rate for computer science graduates is nothing new: in 2005/06 the rate was higher – 88.6% – but still the lowest of the categories listed by HESA (the ‘media studies’ rate was 91.4% and the maths rate 93.9% for that year).

So, what is the reason? Have we too many computer scientists or programmers? It seems unlikely, though it is the case that graduates from mathematics and the physical sciences are quite likely to also be competing for the computer science jobs that are available.

So are computer science degrees a poor grounding, or are the students who pick this degree deficient in some other way? I really do not know.