The previous post on the blog got a good showing on Hacker News yesterday (I made +19 on my submission, so not outstanding, but not bad).

So it’s a good opportunity to compare Hacker News’s impact on a website with Slashdot’s – and Slashdot still comes out on top.

The 4,000 or so additional visitors I got yesterday fall a long way short of the 15,000 – 30,000 I would expect from Slashdot, and the fall-off from the peak is much quicker (perhaps that would be different if I had got enough +1s to reach the top of the first page).

Thanks to Slashdot and a couple of other places, this last week has seen the blog visited by close to 60,000 browsers – an impressive figure in anyone’s book. But almost all (about 50,000) of that was compressed into last weekend and at the peak there were perhaps 150 new visitors each minute.

Now, I know from personal experience that, with good design and testing, a dedicated web server on a machine with enough memory could probably handle that – but I also know it is difficult to guarantee. WordPress.com handled it all with ease, so although I am unlikely to have such a busy period again any time soon, I will be sticking with the free service.

I never managed to get the thing into mainline – indeed the battering I got last time I tried, in 2009, more or less made me give up writing anything for the kernel and the Dreamcast was put away.

I am not pretending my code was pretty, or even particularly good, but it is no wonder people are put off contributing when they get pig-ignorant comments like these:

Everything about the on-disk format is tied to the VMU. Until that sinks in, don’t bother sending me email, thanks.

This was someone who ought to have known better claiming that it was not possible to have a free-standing filesystem for this at all – though they were making their incorrect claim in the manner seen all too frequently on the Linux Kernel Mailing List.

No comments. Really. There must be some limits on the language one is willing to use on public maillist, after all.

As you can tell, this person – a top-flight Linux hacker – did not like my code. And, looking back, I can hardly blame him: it was pretty ugly. But as help in fixing it this is of next to no use – it only serves to demotivate. Nasty computer scientists, again.

Ok, so I have got that off my chest. And I am once more placing myself in the firing line.

The filesystem code, a work in progress (so check how far the code has got once you click the link), can be found here. A filesystem image that you should be able to mount under loopback can be found here. All testers welcomed with open arms.

I raised this issue with faculty staff at Birkbeck a year ago and got a very vague answer – which sounded pretty much like “the university has copyright in your work but we won’t enforce it.”

I honestly do not know whether that is true and, in fact, the phrase I was required to apply to my MSc project report – “the report may be freely copied and distributed provided the source is explicitly acknowledged” – would represent a breach of the GNU GPL if the university sought to apply it to the Linux patch code which currently sits in my Git repo. (Of course we can argue about whether I have actually distributed anything, and the answer is probably no.)

I was left slightly dissatisfied by the official response to my question, because it suggested the college did not really take the issue seriously – which, given that it is at the heart of some long-running and extremely important debates in software development and distribution, seems odd. Though, to be fair, I think the concern was more that someone would try to “go proprietary” on their software rather than allow it to be shared as part of the academic commons.

I would share that concern, of course. But it would not take much effort to allow students to pick from a range of licensing terms that would both protect the college’s desire to share and reuse any software or ideas present in the work and ensure that other necessary licensing constraints are met.

It’s been an interesting week here on the blog … as I said before, getting on Slashdot is great as you get access to the intellect of a lot of people, and that taught me quite a bit.

It’s not relevant to my course, but I went to the Maths shelves of the Birkbeck library and withdrew The Language of Machines – one of the few books on computability in the library.

But before I get down to reading much of that I think I need to reply to some of the harsher remarks made about this blog on Slashdot – in particular about the piece I wrote at the end of December called “What if P=NP?”

The opening line of that article is “I admit I now going slightly out of my depth” and it does contain a few errors – though nothing, I think, that justifies some of the really nasty things a couple of people wrote about it. Then again, some people also insisted that proving P=NP would not break public key encryption, so plainly there are people on /. whose ignorance is matched by the vehemence of their opinions. Maybe I shouldn’t let it get to me – especially as one of the critics insisted there were no algorithms to solve NP problems (so how are they solved? Fairy dust inside your computer?).

But I thought I’d try to rewrite my blog post in a better way….

I admit I am now going slightly out of my depth, but I will try to explain what this is about and why it is interesting.

It is known that computers can solve some problems in what is called “polynomial time”: that is to say, in a finite time bounded by a polynomial in the size of the input. The key thing is that these problems are computable using known mechanical steps (i.e. algorithms). Often that means they are solvable in a way that is (sometimes euphemistically) described as “quickly”.

These can be simple problems – like finding the sum of the first ten integers – or more complex ones, such as building self-balancing trees, sorting database records alphabetically and so on.

These are the so-called “P” (for polynomial time class) problems. Here’s a definition:

A problem is assigned to the P (polynomial time) class if there exists at least one algorithm to solve that problem, such that the number of steps of the algorithm is bounded by a polynomial in n, where n is the length of the input.
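To make the definition concrete, here is a minimal Python sketch (my own illustration – the function names are invented for this post) of the two P-class examples mentioned above: summing the first n integers takes a number of steps linear in n, and comparison sorting takes O(n log n) steps, both bounded by a polynomial.

```python
def sum_first_n(n):
    """Sum the integers 1..n by simple iteration: O(n) steps, linear in n."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sort_records(records):
    """Sorting is also in P: a comparison sort runs in O(n log n) steps,
    which is bounded above by the polynomial n**2."""
    return sorted(records)

print(sum_first_n(10))                           # 55
print(sort_records(["carol", "alice", "bob"]))   # ['alice', 'bob', 'carol']
```

The point is not the code itself but that the step count grows only polynomially as the input grows.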

Then there is another class of problems which seem fiendishly difficult to solve but for which it is relatively simple to check the correctness of any proposed solution. These problems can also be solved (computed) in a finite time – they can be computed by a Turing machine (a simple model of a computer), so an algorithmic solution exists. The catch is that no polynomial-time algorithm is known: in the worst case the only way – it is thought – to find a solution is through an exhaustive search of the candidate solutions, in other words a form of “brute force”. These are the NP (non-deterministic polynomial time) problems: checking an answer is quick, but finding one may not be.
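The asymmetry is easiest to see with a classic NP problem, subset sum: given a set of numbers, is there a subset adding up to a target? A quick sketch (my own illustration, not a reference implementation) shows that *verifying* a proposed answer is cheap, while the obvious way to *find* one tries all 2^n subsets.

```python
from itertools import combinations

def verify(candidate, numbers, target):
    """Checking a proposed solution is easy: a handful of sums and
    membership tests, polynomial in the size of the input."""
    return all(x in numbers for x in candidate) and sum(candidate) == target

def brute_force_solve(numbers, target):
    """No polynomial-time method is known: in the worst case we try
    every subset, and there are 2**len(numbers) of them."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return list(subset)
    return None

nums = [3, 34, 4, 12, 5, 2]
sol = brute_force_solve(nums, 9)
print(sol, verify(sol, nums, 9))   # [4, 5] True
```

Six numbers means 64 subsets; sixty numbers would mean around 10^18 – that exponential blow-up is what “brute force” costs.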

Now most mathematicians think that NP does not equal P – i.e. there are no “quick” solutions to the NP problems – and that may or may not be a good thing, as much of our internet commerce relies on encryption which is thought to rest on such a problem: if properly done it is effectively impervious to brute force attacks, because of the time it would take to factor the product of the two very large prime numbers at the heart of the process.
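A toy sketch of why (assuming RSA-style keys, and with deliberately tiny primes – real keys are hundreds of digits): the brute-force way to recover the secret primes is trial division, whose cost grows exponentially in the length of the key.

```python
def trial_division(n):
    """Brute-force factoring: try every candidate divisor up to sqrt(n).
    For a k-bit semiprime this needs on the order of 2**(k/2) divisions,
    exponential in the key length -- which is why properly sized keys
    resist this attack."""
    if n % 2 == 0:
        return 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d
        d += 2
    return n  # n itself is prime

# A toy "public key": the product of two small primes.
p, q = 10007, 10009
print(trial_division(p * q))   # 10007 -- instant here, infeasible at scale
```

With five-digit primes this finishes instantly; double the number of digits a few times and the same loop would outlast the universe.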

(In Afghanistan in 2001, US cryptanalysts seemingly brute-forced a Taliban Windows NT box, but it used much weaker encryption than most electronic commerce.)

But what if it were the case that all seemingly NP problems were actually P problems? There are a lot of people studying this issue – but according to the New Scientist (their Christmas edition, the first in my subscription, delivered this morning, inspired me to write this) we should expect to wait at least until 2024 for an answer (by which point the problem – first formulated in 1971 – will have reached the age at which 50% of major mathematical problems have been solved).

This all matters: unlike, say, Fermat’s Last Theorem, proving P = NP is likely to have more or less immediate effects on the lives of many of us (how would you feel if you knew it was possible, even if not likely, that someone had developed a way to quickly crack open all your internet credit card transactions?).

Of course, proving P = NP does not necessarily mean we would immediately have polynomial-time solutions for all the current NP problems out there. But we should not expect that situation to last for long. It may be that the algorithms turn out to be exceptionally complex, rendering them very slow even on the fastest computing equipment – but if history teaches us anything it is that computers get faster (yes, I know there are bounds on that).

And, actually, I think the benefits to humanity would be enormous. The most likely immediate effects would be improvements in computing and operating system efficiency: faster computers for less money and/or less energy consumption would be a huge benefit. From that alone many other benefits would flow in terms of information availability and universality.

But there could be many, many others in medicine and linguistics and even in getting the trains to run on time!

Last night I submitted a story from this blog – about P=NP – to Slashdot; today the submission was accepted, and in a few hours I have had perhaps five times more visitors than in the previous two and a half months.

But, thanks to WordPress.com there was no “slashdotting” – instead the site has stayed up and running throughout.

Slashdot has a reputation for flame wars and ignorance, but actually the discussion has been great – I wish I could get a few more articles from the blog on to /. as then I would benefit from the clever people who hang out there: yes there is a lot of noise, but there’s also good signal.

The other thing it has reminded me of is the need to avoid imprecision in scientific writing. A few flames have been aimed at me and at articles on the blog: one item was called horrible when, actually, what I think was really wrong was a few corners cut in an effort to make the exposition a little less cluttered.