Computer science in the UK: in the wrong direction?

Servers designed for Linux (Photo credit: Wikipedia)

Two big thoughts strike me as a result of the literature review I have just completed for my PhD:

  • Linux is not the centre of the universe, in fact it is a bit of an intellectual backwater;
  • The UK may have played as big a role in the invention of the electronic computer as the US, but these days it is hardly even in the game in many areas of computing research.

On the first point I am in danger of sounding like Andy “Linux is obsolete” Tanenbaum, but it is certainly the case that Linux is far from the cutting edge of operating systems research. If massively parallel systems do break through to the desktop, it is difficult to imagine they will be running Linux (or any monolithic operating system).

In fact the first generation may well do so – because nobody has anything else right now – but in that case Linux will be a kludge.
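To make that point a little more concrete, here is a toy sketch of my own – not taken from any real kernel, with invented names and numbers – of why shared kernel state is the problem. In a monolithic design every core contends for the same locks, so adding cores adds waiting; a multikernel design (the approach of systems such as Barrelfish, from ETH Zurich) gives each core private state and combines results by explicit communication.

/* Toy illustration: global-lock sharing vs per-core private state.
 * Compile with: cc -pthread sketch.c */
#include <pthread.h>
#include <stdio.h>

#define NTHREADS 8
#define NITERS   1000000

/* Monolithic-style: one shared counter behind one global lock. */
static long shared_count = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *shared_worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < NITERS; i++) {
        pthread_mutex_lock(&lock);   /* every core serialises here */
        shared_count++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

/* Multikernel-style: each core updates private state; totals are
 * combined afterwards by explicit communication (here, a final sum).
 * A real system would also pad entries to cache-line size to avoid
 * false sharing. */
static long per_core_count[NTHREADS];

static void *private_worker(void *arg)
{
    long id = (long)arg;
    for (int i = 0; i < NITERS; i++)
        per_core_count[id]++;        /* no cross-core lock contention */
    return NULL;
}

int main(void)
{
    pthread_t t[NTHREADS];

    for (long i = 0; i < NTHREADS; i++)
        pthread_create(&t[i], NULL, shared_worker, NULL);
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(t[i], NULL);

    for (long i = 0; i < NTHREADS; i++)
        pthread_create(&t[i], NULL, private_worker, (void *)i);
    for (int i = 0; i < NTHREADS; i++)
        pthread_join(t[i], NULL);

    long total = 0;
    for (int i = 0; i < NTHREADS; i++)
        total += per_core_count[i];

    printf("shared: %ld, per-core total: %ld\n", shared_count, total);
    return 0;
}

On a machine with many cores the first pattern tends to flatten out as lock contention grows, while the second should scale roughly with the core count – which is why the new hardware models make the old monolithic assumptions look shaky.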

When I was doing my MSc, which focused on a Linux-related problem, it seemed to me that we had come close to “the end of history” in operating system research – i.e. the issue now was fine-tuning the models we had settled on. The big issues had been dealt with in the late 1960s, the 1970s and the 1980s.

Now I know different. Operating systems research is very vibrant and there are lots of new models competing for attention.

But along the way the UK dropped out of the picture. Read some papers on the development of virtual memory and it will not be long before you come across the seminal experiment conducted on EMAS – the Edinburgh Multi-Access System – which was still in use when I was there in 1984. Now you will struggle to find any UK university – with the limited exceptions of Cambridge and York (for real-time systems) – making any impact (at least, that’s how it seems to me).

It’s not that the Americans run the show either – German, Swiss and Italian universities are leading centres of research into systems software.

I am not sure how or why the UK slipped behind, but it feels like a mistake to me – especially as I think new hardware models are going to drive a lot of software innovation in the next decade (well, I would say that, wouldn’t I?).