## Computer science in English schools: the debate rages on

In recent months a new consensus has emerged about teaching ICT (information and communications technology) in England’s schools: namely that it has been sent up a blind alley where kids are taught little more than how to manipulate Microsoft’s “Office” products.

That recognition is a good thing, though the way in which the government were finally roused into action – by a speech from a Google bigwig – was not so edifying. If the previous Labour government had a distressing and disappointing attitude of worshipping the ground Bill Gates trod upon, the Conservative wing of the coalition seems mesmerised by Google (not least because of some very strong personal and financial ties between Google and leading Conservatives).

But recognising there is a problem and fixing it are two very different things. The proposals from the Education Secretary, Michael Gove, seem contradictory at best: on the one hand he’s said we need a new curriculum; on the other he’s seemingly refused to do anything to establish one. The revelation last week that he’s axed the bit of his department that might create such a curriculum did not inspire confidence.

But the pressure for change is still mounting. In tomorrow’s Observer, John Naughton, author of the celebrated A Brief History of the Future: Origins of the Internet, launches his manifesto for ICT (as it’s a manifesto I have copied it in full, but you should really also read his article here):

1. We welcome the clear signs that the government is alert to the deficiencies in the teaching of information and communications technology (ICT) in the national curriculum, and the indications you and your ministerial colleagues have made that it will be withdrawn and reviewed. We welcome your willingness to institute a public consultation on this matter and the various responses you have already made to submissions from a wide spectrum of interested parties.

2. However, we are concerned that the various rationales currently being offered for radical overhaul of the ICT curriculum are short-sighted and limited. They give too much emphasis to the special pleading of particular institutions and industries (universities and software companies, for example), or frame the need for better teaching in purely economic terms as being good for “UK plc”. These are significant reasons, but they are not the most important justification, which is that in a world shaped and dependent on networking technology, an understanding of computing is essential for informed citizenship.

3. We believe every child should have the opportunity to learn computer science, from primary school up to and including further education. We teach elementary physics to every child, not primarily to train physicists but because each of them lives in a world governed by physical systems. In the same way, every child should learn some computer science from an early age because they live in a world in which computation is ubiquitous. A crucial minority will go on to become the engineers and entrepreneurs who drive the digital economy, so there is a complementary economic motivation for transforming the curriculum.

4. Our emphasis on computer science implies a recognition that this is a serious academic discipline in its own right and not (as many people mistakenly believe) merely acquiring skills in the use of constantly outdated information appliances and shrink-wrapped software. Your BETT speech makes this point clearly, but the message has not yet been received by many headteachers.

5. We welcome your declaration that the Department for Education will henceforth not attempt to “micro-manage” curricula from Whitehall but instead will encourage universities and other institutions to develop high-quality qualifications and curricula in this area.

6. We believe the proper role of government in this context is to frame high-level policy goals in such a way that a wide variety of providers and concerned institutions are incentivised to do what is in the long-term interests of our children and the society they will inherit. An excellent precedent for this has in fact been set by your department in the preface to the National Plan for Music Education, which states: “High-quality music education enables lifelong participation in, and enjoyment of, music, as well as underpinning excellence and professionalism for those who choose not to pursue a career in music. Children from all backgrounds and every part of the UK should have the opportunity to learn a musical instrument; to make music with others; to learn to sing; and to have the opportunity to progress to the next level of excellence.” Substituting “computing” for “music” in this declaration would provide a good illustration of what we have in mind as a goal for transforming the teaching of computing in schools. Without clear leadership of this sort, there is a danger schools will see the withdrawal of the programme of study for ICT in England as a reason to drop the subject in favour of English baccalaureate subjects.

7. Like you, we are encouraged by the astonishing level of public interest in the Raspberry Pi project, which can bring affordable, programmable computers within the reach of every child. But understanding how an individual machine works is only part of the story. We are rapidly moving from a world where the PC was the computer to one where “the network is the computer”. The evolution of “cloud computing” means that the world wide web is morphing into the “world wide computer” and the teaching of computer science needs to take that on board.

8. In considering how the transformation of the curriculum can be achieved, we urge you to harness a resource that has hitherto been relatively under-utilised – school governors. It would be very helpful if you could put the government’s weight behind the strategic information pack on Teaching Computer Science in Schools prepared by the Computing at School group, which has been sent to every head teacher of a state-maintained secondary school in England to ensure that this document is shared with the governors of these schools.

9. We recognise that a key obstacle to achieving the necessary transformation of the computing curriculum is the shortage of skilled and enthusiastic teachers. The government has already recognised an analogous problem with regard to mathematics teachers and we recommend similar initiatives be undertaken with respect to computer science. We need to a) encourage more qualified professionals to become ICT teachers and b) offer a national programme of continuing professional development (CPD) to enhance the teachers’ skills. It is unreasonable to expect a national CPD programme to appear out of thin air from “the community”: your department must have a role in resourcing it.

10. We recognise that teaching of computer science will inevitably start from a very low base in most UK schools. To incentivise them to adopt a rigorous discipline, computer science GCSEs must be added to the English baccalaureate. Without such incentives, take-up of a new subject whose GCSE grades will be more maths-like than ICT-like will be low. Like it or not, headteachers are driven by the measures that you create.

11. In summary, we have a once-in-a-lifetime opportunity to prepare our children to play a full part in the world they will inherit. Doing so will yield economic and social benefits – and ensure they will be on the right side of the “program or be programmed” choice that faces every citizen in a networked world.

## Why you cannot always trust social media: a practical example

Reading more about the excellent Red Plenty, I came across this discussion on the US blog “Crooked Timber”.

There are some very odd contributions there. One commentator in particular, Louis Proyect, waxes on and on, in ignorance of the book’s real content (he has not read it, he says), about its flaws. Ignore that and read a copy: it’s brilliant.

But the discussion also contains something worse than the at-length views of a loud-mouthed know-it-all: a link to an Amazon.com page for ‘Khrushchev Lied: The Evidence That Every “Revelation” of Stalin’s (and Beria’s) Crimes in Nikita Khrushchev’s Infamous “Secret Speech” to the 20th Party Congress of the Communist Party of the Soviet Union on February 25, 1956, is Provably False’.

My interest here is not in the likely psychotic state of the author (I am assuming that Grover Furr did not write such a ludicrous book as a way of making fun of Stalinists), but in the fact that on Amazon.com it has six five-star reviews – the first of which has been found “helpful by 26 out of 29 readers” (it was out of 28 before I got there), and all of which are rated positively.

In other words, a cult of Stalin freaks – people who lust for the Gulags and Nazi–Soviet pacts, who revel in anti-Semitism and mass deportations, famine and slaughter – have been able to fix up the Amazon social media recommendation system without many people spotting what they were up to.

So … (a) remember this when you next rely on nothing but a social media recommendation to make a purchase and (b) go there and vote these reviews as unhelpful. That is the least any of us can do to honour the many millions of victims of Stalinism.

## Why isn’t the universe of infinite density?

Another speculation produced by Brian Greene’s The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos.

Imagine the universe was infinite (along the “quilted multiverse” pattern – namely that it stretched on and on and we could only see a part of it). That would imply – assuming the “cosmological principle” applied, i.e. that one bit of the universe looks much like any other – that there were an infinite number of hydrogen atoms out there.

So, why is the universe not of infinite density? Because surely Schrödinger’s equation means that there is a finite probability that an electron could be in any given region of space? (Doesn’t it?)
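The probability at stake here is the standard Born-rule one: for a single electron with normalised wavefunction $\psi$ and any region of space $V$,

$$P(\text{electron in } V) = \int_V |\psi(\mathbf{r})|^2 \, \mathrm{d}^3r \le \int |\psi(\mathbf{r})|^2 \, \mathrm{d}^3r = 1,$$

so for any one electron the probability is always finite – indeed never more than one.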

For any given electron the probability in “most” regions of space is zero in any measurable sense. But if there are an infinite number of electrons, then isn’t the expected number of electrons in any given region infinite?

OK, I have obviously got something wrong here because nobody is dismissing the “quilted multiverse” idea so simply – but could someone explain what it is I have got wrong?

Update: Is this because space-time is a continuum and the number of electrons a countable infinity?

## Cosmologists’ problems with aleph-null and the multiverse

This is another insight from Brian Greene’s book The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos – well worth reading.

Aleph-null ($\aleph_0$) is the order (size) of a countably infinite set. The counting numbers are the obvious example: one can start from one and keep on going. But any infinite set whose members one can number has the order $\aleph_0$. (There are other, larger infinities – e.g. that of the continuum – which have a different size.)

It is the nature of $\aleph_0$ that proportions of it are also infinite with the same order. So 1% of a set with the order $\aleph_0$ is also of order $\aleph_0$. To understand why, think of the counting numbers. If we took a set of 1%, then the first member would be 1, the second 101, the third 201 and so on. It would seem this set is $\frac{1}{100}^{th}$ of the size of the counting numbers, but it is also the case that, because the counting number set is infinite with order $\aleph_0$, the 1% set must also be infinite and have the same order. In other words, however paradoxically, the sets are in fact of the same order (size) – $\aleph_0$.
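The pairing described above can be written down explicitly. If $S = \{1, 101, 201, \dots\}$ is the “1% set”, then

$$f : \mathbb{N} \to S, \qquad f(n) = 100(n-1) + 1$$

matches every counting number with exactly one member of $S$, and its inverse $f^{-1}(m) = \frac{m-1}{100} + 1$ does the reverse. Such a one-to-one pairing (a bijection) is precisely what it means for the two sets to have the same order, $\aleph_0$.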

The problem for cosmologists comes when considering whether we can use observations of our “universe” to point to the experimental provability of theories of an infinite number of universes – the multiverse.

The argument runs like this: we have a theory that predicts a multiverse. Such a theory also predicts that certain types of universe are more typical, perhaps much more typical than others. Applying the Copernican argument we would expect that we, bog-standard observers of the universe – nothing special in other words – are likely to be in one of those typical universes. If we were in a universe that was atypical it would weaken the case for the theory of the multiverse.

But what if there were an infinite number of universes in the multiverse? Then, no matter how atypical any particular universe was (as measured by the values of various physical constants), there would be an infinite number of such atypical universes. It would hardly weaken the case for the multiverse theory if it turned out we were stuck inside one of these highly atypical universes, because there would be an infinite number of them.

This “measure problem” is a big difficulty for cosmologists who, assuming we cannot build particle accelerators much bigger than the Large Hadron Collider, are stuck with only one other “experiment” to observe – the universe. If all results of that experiment are as likely as any other, it is quite difficult to draw conclusions.

Greene seems quite confident that the measure problem can be overcome. I am not qualified to pass judgement on that, though it is not going to stop me from saying it seems quite difficult to imagine how.

## The Copernican principle and the multiverse

Thinking about this leaves my mind in a bit of a twist, but it is worth exploring.

I am still reading Brian Greene’s The Hidden Reality: Parallel Universes and the Deep Laws of the Cosmos: a great book (just enough maths in the footnotes to make me feel I haven’t completely lost touch yet with a clear narrative in plain English in the body).

In the book there is, understandably enough, a fair bit of discussion of the “cosmological constant” – an anti-gravitational force that is powering the universe’s expansion.

It turns out that this force is just about the right value to allow galaxies to form (if it were too high then gravity would not be able to overcome it; if it were too low then gravity might just pull everything into one lump or a black hole). Without galaxies, goes the reasoning (after Steven Weinberg), there would be no life, as galaxies allow the mixing of various elements (eg everything on the Earth that comes higher in the periodic table than iron was manufactured in a supernova, while everything that is heavier than helium surely got here in the same explosive way – we are not so much what stars are made of as made of stars).

But there are about $10^{124}$ different values of the cosmological constant that could have a measurable effect on our universe’s physical laws, argues Brian Greene, who, via the Copernican principle (that humans are not at the centre of the universe), essentially demands that there be approximately (in fact, rather more than) that number of universes out there, to show that our universe, with its physical laws (or, more accurately, its physical constants – the laws being immutable), is just another typical drop-off point.

And, happily for Greene, string theory allows for about $10^{500}$ universes, so it is perfectly possible for this one, with its particular cosmological constant, to be just typical.

But, while I understand this argument and, of course, it has a beauty and is perhaps the ultimate vindication of Doctor Copernicus, it also seems to me to be flawed. There seems to me to be no need to demand these additional universes, because we can only observe the universe we are in. Were there to be only one universe (I know the term makes “only one” technically a tautology, but I hope you understand the point) and it had different physical characteristics, we simply would not be around to see it.

The fact that our universe has a particular set of characteristics and we can see it seems to me to prove or demand nothing very much (ie., I am not making some argument in favour of a “grand designer” either) – other than we have “won” a physical lottery. We exist because of the physical characteristics of the universe, not the other way round, which it seems to me is quite close to what Greene demands.

## No (well, not much) kernel hacking on a Sunday

These days it is possible to host the Linux kernel on GitHub, and their tools reveal some interesting things about the pattern of kernel hacking (or at least of kernel committing).

The “punchcard” tool shows what times commits are made. And here it is for the Linux kernel:

It seems that kernel hacking is pretty much a 9-to-5, five-days-a-week task, though with a bit of extra stuff in the evenings – pretty much what one would expect from a team of office workers.
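For anyone who wants to check this against their own clone, GitHub’s punchcard can be roughly approximated with plain git – this one-liner is my own sketch, not the tool GitHub uses:

```shell
# Rough local "punchcard": count commits by day-of-week and hour of day,
# busiest slots first. Run inside any git clone (e.g. a kernel tree).
git log --pretty=format:'%ad' --date=format:'%a %H' | sort | uniq -c | sort -rn | head
```

Each output line is a count followed by a day/hour slot (e.g. `Tue 14`), so the top lines are the times when most committing happens.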

With more and more additions to the kernel coming straight off git pulls, this pattern must reflect rather more than Linus Torvalds’ own office habits.

It looks like the image of kernel hackers as nerds pulling all-nighters with the help of “rotary debuggers” (see Hackers: Heroes of the Computer Revolution for more of that) is well past its use-by date: building Linux is just a job.

## Mild roasting delivered

Well, I didn’t get the worst-ever kicking from my posting to LKML, and I did get some pointers on how to improve the VMUFAT code, which I am now working on amid everything else in my life.

## Touch paper lit…

Have posted the VMUFAT code to LKML – see here and similar.

## Pushing to a remote git over a non-standard port

It took me a while to work out how to do this, so I thought I should write it down for my own use and for anybody else who wants it.

I have my SSH daemon running on a non-standard port and wanted to push to my git repository.

The key is to add the repository, like so:

```
git remote add reponame ssh://username@server:port/repo/location/.git
```

Then pushing is as simple as:

```
git push reponame localbranch:remotebranch
```
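An alternative (not in the original tip – the host alias, hostname, port and user below are all placeholders) is to record the port once in `~/.ssh/config`, so git never needs to see it:

```
# ~/.ssh/config
Host myserver                    # alias you pick
    HostName server.example.com  # the real server
    Port 2222                    # your non-standard SSH port
    User username
```

After that, `git remote add reponame myserver:/repo/location/.git` works, and every other SSH-based tool picks up the same port for free.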

## A meme you can believe in

Update: Thanks to Mary Wimbury (@marywimbury) for pointing out that this was, of course, someone making fun of anti-equal marriage protestors – at the Massachusetts constitutional convention. Not even homophobes are likely to complain about beating the Nazis, are they?