Been tidying up at home and it was time to admit my days as a Dreamcast hacker were over – my last posting to LKML, a filesystem driver for the File Allocation Table (FAT) filesystem found on the DC’s “Visual Memory Unit” flash device (called VMUFAT), didn’t even elicit a response.
The BBA – the Dreamcast's Broadband Adapter – is a rare item and cost me about £100 plus import duties back in 2001, and they still seem to sell for high prices, but I started my auction at 99p – if there is another hacker out there who is still beavering away at their DC maybe they'll be able to pick it up cheaply (though of course I want as much as I can get!).
The big advantage of the BBA is that it allows you to do things like mount NFS volumes on your Dreamcast and so is much more flexible for a developer.
I am a bit sad about it but it’s the sensible thing to do.
A few weeks ago I attended the morning (I had to go back to work in the afternoon) of the BCS doctoral consortium in Covent Garden in London – watching various PhD students present their work to an audience of peers.
The presentation which most interested me was that of Srikanth Cherla, who is researching connectionist models for music analysis and prediction, and the use of generative models to produce short passages of music in a similar style to the passages his systems learn from.
It’s not a field that I have any expertise in or indeed much knowledge of, though in essence (I hope I get this right): a specialised form of neural network is used to analyse musical passages (Bach’s chorale works were highlighted) and from there it is possible to get the computer to play some passages it has composed based on the style it has learnt.
Srikanth emphasised that it was not a case of applying a rigid rule that guessed or picked the next note – there is a semi-random/stochastic element that can be attributed to certain musical patterns in the works of the great composers and capturing that is important.
And the music he played at the end – while plainly not matching Bach, did certainly sound like Bach.
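Srikanth's systems are connectionist (neural-network-based), not Markov chains, but the key idea he emphasised – that the next note is sampled stochastically, in proportion to patterns learnt from a corpus, rather than picked by a rigid rule – can be sketched with a toy first-order Markov model. The note sequences below are invented purely for illustration; they stand in for real chorale phrases:

```python
import random
from collections import Counter, defaultdict

# Toy corpus: made-up note sequences standing in for Bach chorale phrases.
corpus = [
    ["C", "D", "E", "F", "G", "E", "C"],
    ["C", "E", "G", "E", "D", "C"],
    ["G", "E", "D", "C", "D", "E", "C"],
]

# "Learning": count how often each note follows each other note.
transitions = defaultdict(Counter)
for seq in corpus:
    for a, b in zip(seq, seq[1:]):
        transitions[a][b] += 1

def generate(start, length, rng):
    """Sample a passage note by note, weighting each candidate next note
    by its frequency in the corpus -- stochastic, not a rigid rule."""
    out = [start]
    while len(out) < length:
        followers = transitions.get(out[-1])
        if not followers:  # no observed continuation from this note
            break
        notes, weights = zip(*followers.items())
        out.append(rng.choices(notes, weights=weights)[0])
    return out

print(generate("C", 8, random.Random(0)))
```

Run it twice with different seeds and you get different but stylistically similar passages – which is the flavour of the "semi-random" element described above, albeit in a hugely simplified form.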
Today, prior to writing this blog I read through Turing's October 1950 paper "Computing Machinery and Intelligence", from which we get the idea of a "Turing Test" (though obviously he doesn't call it that).
The paper begins:
I propose to consider the question, ‘Can machines think?’
And goes on to discuss ways in which it might be possible "by the end of the century" to have machines which could fool a remote observer – able only to read typed answers to questions – into believing that a digital computer was in fact a person.
The paper is not, for Turing at least, in a completely different field to "On computable numbers": Turing's essential point is that anything a human computer can do, a digital computer can do, and he goes on to explicitly call humans machines.
The idea that great works of art, such as the “next” set of Bach chorales, might in the future be composed by computer no doubt horrifies many readers, as it plainly did in Turing’s day too – as he deals specifically with what he calls “the theological objection” – an extreme objection based on the idea that “God gives an immortal soul to every man and woman, but not to any other animal or machine”:
I am unable to accept any part of this… I am not very impressed with theological arguments whatever they may be used to support
But in any case, arguing from within the theological paradigm, he dismisses it as a human imposition on what is meant to be an unlimited Godly power:
It appears to me the argument quoted above implies a serious restriction on the omnipotence of the Almighty
…before going on to swat aside Biblical literalism as an argument by citing how it was used against Galileo (maybe there are still fundamentalists out there who believe in the literal truth of Psalm 104 and an unmoving Earth but if so they keep quiet about it).
Then he deals with the argument that machines could not appear human because they have no consciousness by essentially asking what consciousness is anyway – and how we can prove others have it – before going on to deal with "various disabilities", such as computers being unable to appreciate the taste of strawberries with cream:
The inability to enjoy strawberries and cream may have struck the reader as frivolous. Possibly a machine might be made to enjoy this delicious dish, but any attempt to make one would be idiotic. What is important about this disability is that it contributes to some of the other disabilities e.g. to the difficulty of the same kind of friendliness occurring between man and machine as between white man and white man, or between black man and black man.
This passage is worth quoting both because it suggests that Turing is far from the 100% progressive superhero later admirers are tempted to paint him as – he beat the Nazis and was persecuted as a gay man and therefore can do no wrong; in fact he was a man of his times, with all that implies – and because I find it a less than fully satisfying answer.
In context I think the point he is seeking to make is that we could make a machine that liked "eating" strawberries and could be friends with its fellows (so long as they had the same skin colour – don't let's get too radical!) but why would we bother… though it is not totally clear.
Similarly he, like Hofstadter, deals less than satisfactorily with the so-called Gödelisation argument: this states that we, humans, can state true statements about numbers that machines cannot determine (i.e. we know they are true but the machine cannot decide whether they are true or false). Hence we could, in the imitation game, pose a Gödel-number-type puzzle that the computer could never answer.
Actually, of course, the computer could guess, as humans often do! But the more general point – that humans can do something machines cannot, and so we are not truly Turing machines – seems to me unanswered by the argument Turing and Hofstadter both make: that we can also find questions humans cannot determine if we make them complex enough.
Perhaps an expert would care to comment?
Update: Following some feedback from Srikanth I have edited the passages referring to his work slightly – I haven't changed the sense, I think, just made it a bit clearer. I also updated the Psalm number, as I had misread Turing's reference to line 5 of the Psalm as the number of the Psalm itself.
In a programme as complex as universal credit, which includes new IT developments and changes to existing IT assets, both agile and waterfall methods may be appropriate at different times. As examples, initial development used agile techniques while, in its final stages of testing for the pathfinder from April 2013, the programme is using the waterfall approach—a standard DWP testing methodology.
This is the answer of Mark Hoban, Minister of State at the DWP, in a written answer to a parliamentary question delivered on 12 March and seemingly unnoticed by most people except Computer Weekly – who have hit the nail very firmly on the head:
The DWP gave up using the "agile" method of software development for Universal Credit, the coalition government's flagship reform programme, last month.
It had before now repeatedly claimed agile was the way it would keep Universal Credit on track. The coalition government had meanwhile singled agile out as a major part of its flagship strategy to stop IT projects going calamitously and expensively wrong.
The Major Projects Authority – part of the Cabinet Office – was going to press-gang government departments to use agile methods on big software builds. Universal Credit was the government’s first – and immensely ambitious – big stab at agile…
…The u-turn raises questions about whether the government was wrong to pin its hopes on agile as the way to crack the government IT problem: did DWP ditch agile because it didn’t work? Was agile not all it was cracked up to be?
Or did DWP make such a bodge of agile that Universal Credit is now likely to be a bodge as well? Did failure compel it to fall back quickly to the well-worn waterfall path in an attempt to get things back on track?
A DWP spokesman said neither case was true.
“Just because we are not using agile doesn’t mean agile is inherently flawed, or that Universal Credit is inherently flawed,” he said. “It probably means agile is at a point where agile is not appropriate.”…
…Then last year it started to look like DWP was going to manage only a token roll-out of Universal Credit in October 2013. That's when Duncan Smith came out with the most complete description of agile software development, and the clearest commitment to it, that, really, anyone in the software community could ever hope to imagine. Universal Credit wasn't just agile. It really was agile. And it would remain so to the bitter end of the roll-out in 2017.
“There is a lot of ignorance at the moment in the media… saying, ‘You are not going to be ready on time’. The truth is the time that we deliver this is 2017. So that is over four years. We start that process in October,” Duncan Smith told the Work and Pensions Committee on 17 September 2012.
“The whole point about the agile process – which I find frustrating at times, because we cannot quite get it across to people – is to understand that agile is about change.
“It is about allowing you to get to a certain point in the process: one leap – check it out, make sure it works.
“And as you go into the next leap, you come up with something that says, ‘We can rectify some of the issues in this, and make that even more efficient’.
“You are constantly rolling forward, improving and making more efficient things. There is a constant retrospective change that goes on to complete that system, and that is what will happen all through those four years,” he pledged.
I think I need to give a little bit more background on the politics of the decision by the DWP to trumpet its use of "agile" methods and how, bluntly, the department has misused the potential of agile to give it cover in its huge gamble with public money and the living standards of millions of the least well off.
Of course we have to start from the basic fact that most software projects – whether they are in the public or private sector – fail. The failure could be relatively small – a budget overshoot or a lack of sought-for capability. Or the failure could be huge – your spacecraft blows up on launch, your ambulances are never dispatched and people die, and so on (these last two are real and will be familiar to almost anyone who has done a development methodologies course).
The 1997–2010 Labour government had some successes in software-driven projects – the UK Passport Service, for instance, is now much more efficient. But it also had a fair number of high-profile failures, especially in its efforts to modernise computer use in the health service – above all the attempts to create a single electronic patient record (the major ambulance dispatch failure was not in this period, though a software update did fail during it).
The then Conservative opposition used this often – David Cameron in particular repeatedly suggesting that it was because the Labour government had tried to buy a single supercomputer to run the NHS: something he must have known was simply untrue but presumably worked well in focus groups.
So software projects were and are a hot political topic.
The new government, coming to office in May 2010 did several things to get a grip on failing projects. Firstly, they went for a good old fashioned gouge of the contractors’ margins: essentially saying cut your prices on existing contracts if you even want to be considered for future work. Secondly, they said that new software projects had to seriously consider using free and open source software to avoid proprietary lock-in and thirdly, they said that a new centralised control mechanism had to be applied to ensure that No 10 and the Cabinet Office had a grip on costs and efficiency: it was out of this that the Major Projects Authority, that has now reported UC is close to failure, came.
The three elements have generally worked well. It pains me to give this government political credit, but essentially they deserve it.
Yet UC has been allowed to escape this framework, and “agile” was the excuse given for this.
In, I think, early 2011 the Institute for Government published a report on government software projects and recommended that “agile” methods be used. They cited an experimental, relatively small scale project by the Metropolitan Police Service as an example of how agile could work successfully in government.
I attended the launch seminar which was rather more like a religious mission meeting than a serious seminar on how to get the best value for public money. The room, in one of the government’s finest buildings in St. James’s, was packed to bursting with representatives from small software houses, who saw agile as their ticket to the big time and certainly none of them were going to suggest that “world’s biggest” and “Agile” was a risky mix.
At the meeting the DWP announced that they would be using “Agile” as the basis on which UC was developed. I was only an MSc student but even I thought this looked like exactly the sort of project that the textbooks said Agile was not designed for – but I wasn’t confident enough to say that then and nobody else in the room seemed remotely interested in hearing such a thing.
But it didn’t take long once the meeting was over for people to point out that this was a high risk proposition: but it also became crystal clear that the Secretary of State in the department had decided that Agile was the secret sauce for government IT and that it, and it alone, would lead him to the promised land.
For well over a year it has been an open secret that the Treasury want to pull the plug on UC because they do not believe it can be delivered in anything like a working form to budget. And the signs are all there – essentially the project has already failed as its scope has been cut repeatedly and its final implementation date put back and back.
But the politics of the Conservative Party do not allow anyone in government to say this openly and even when the government’s own project watchdog says the system is on the brink of collapse the department come out and rubbish the assessment – even at the price of contradicting themselves.
This would be funny were it not for the fact that in just a few months millions of the poorest people in Britain will depend on this system working if they are to eat, to heat their homes and clothe their children.
Universal Credit – the amalgamation of various welfare payments into one unified entitlement which will vary in "real time" as claimants' circumstances change – is at the very heart of the British government's plans to reform the welfare state. The idea is that the welfare system will "make work pay". Once that meant it would have a shallow taper – in other words, the loss of benefit as claimants got work would be reduced: today that aim seems less clearly expressed, but that is another issue I won't go into here.
Universal Credit is also the world's biggest ever "agile development" software project and a massive financial and social (and hence political) risk for the government. Unless it is delivered on time and on budget the consequences will be grave – some of the most vulnerable people in society could be left literally destitute, with all that entails for their personal welfare and social order.
Yesterday the government – at least part of it – finally admitted in public what the rest of us have known for a long time: that the project is in deep trouble. As I have said before I have no political sympathies with the government but I do recognise that they have made a lot of progress in their handling of computing. They have opened up projects to free software – while Labour merely talked the talk but really did very little. And they have also instituted a more open review of big projects’ progress: which yesterday saw them admit UC was tottering on the brink.
Now, they tried to hide yesterday's report from the Major Projects Authority – releasing it on a Friday afternoon before a Bank Holiday and, it seems, not even putting it on their website (NB: I am not accusing them of using terrorism as a cover, because I do not think that's true or fair) – but even so it was another small step towards transparency, though the failure to place the report on the website is particularly manipulative.
The report placed UC in the "red-amber" category – not yet failing but very close to it. The government department responsible for UC – Work and Pensions – issued rebuttals in their usual histrionic style, claiming this time that the assessment was eight months out of date – though back in September they were telling us everything was great then too: so either they were lying then, or they are lying now, or they have been lying all along. Other departments – especially the Treasury – are reportedly increasingly anxious about how big the pile-up could be.
UC is being driven by politics, not a realistic assessment of how such a major project could be implemented. Iain Duncan Smith, the Secretary of State at the DWP, has reportedly threatened to resign more than once if his blessed baby was cancelled and the Prime Minister, who is increasingly politically weak, has not dared to call his bluff. “Agile” has been treated as a silver bullet – not as what it really is – just another design methodology – while much of what is supposed to happen with an agile software development project – especially regular and repeated testing of prototypes – has been conspicuously absent.
Some steps have been taken to try to rescue the project. The back end – the benefits calculation – has reportedly been shifted to a “waterfall” development process – which offers some assurances that the government at least takes its fiduciary duties seriously as it should mean no code will be deployed that has not been finished. The front end – the bit used by humans – is still meant to be “agile” – which makes some sense, but where is the testing? Agile is supposed to be about openness between developer and client and we – the taxpayers – are the clients: why can’t we see what our money is paying for?
To me this looks like a game of political chicken now: whose nerve will crack first – the Prime Minister’s or the Secretary of State’s? Either way millions of people could face misery.
Update: The MPA’s report is on the web – here – though they neglected to publish it on the MPA’s own webpage, making it that much more difficult to find. The report defines red/amber status awarded to UC as:
Successful delivery of the project is in doubt, with major risks or issues apparent in a number of key areas. Urgent action is needed to ensure these are addressed, and to establish whether resolution is feasible.
The first computer I ever got my hands on was a Commodore PET – which managed 0.06 million instructions per second (MIPS). (In our case it didn't even manage that, because on that day – the last day of school in July 1980 – Mr Shutler, myself and various others were so excited at the prospect of having a real live computer that we just hacked at it without a clue and never managed to get it to say more than "syntax error".)
At that time the world’s biggest and best computer was the Cray 1 – which managed a massive 150 MIPS: about one-tenth of the maximum computing capacity of an iPhone 5.
What separated the Cray 1 from the PET? Money. The Cray, launched in 1976, cost $10,000,000; the PET – a year younger – $1,500. But for the money (rebased to constant 1997 prices) you got 6,600 instructions per second per dollar out of the Cray and 17,000 instructions per second per dollar out of the PET.
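The quoted per-dollar figures are rebased to constant 1997 prices; in nominal dollars the same back-of-envelope sum looks like this (just a sketch – the nominal ratios differ from the inflation-adjusted ones above, but the PET still wins comfortably):

```python
# Instructions per second per (nominal) dollar for the two machines.
# The figures in the text are rebased to constant 1997 prices, so they
# differ from these nominal ratios -- but the ranking is the same.
def instr_per_sec_per_dollar(instructions_per_second, price_dollars):
    return instructions_per_second / price_dollars

cray_1 = instr_per_sec_per_dollar(150_000_000, 10_000_000)  # 150 MIPS, $10m -> 15.0
pet = instr_per_sec_per_dollar(60_000, 1_500)               # 0.06 MIPS, $1,500 -> 40.0

print(cray_1, pet)    # prints 15.0 40.0
print(pet / cray_1)   # the humble PET was ~2.7x better value per dollar
```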
And the comparison does not just apply to these two. This graph maps MIPS per thousand dollars from the end of the 19th century to more or less today –
But here is the simple MIPS plot:
Here we can see three, and possibly four, clear series: bottom left, the slow advance of mechanical calculation; then, from the 40s to today, the big iron, consistently outperforming the post-1977 rise of the "micros"; and finally the start of a new series – embedded consumer devices like set-top boxes and mobile phones.
As you may have seen, the inventor of the GIF – graphics interchange format – Steve Wilhite, has said that it should be pronounced "jif" and not "gif" (with a hard g).
More to the point – unless you are using the animated format – you should not be using a GIF at all. Either (for photographs) use a JPEG or (for line drawings or any graphic with 256 or fewer distinct colours) use a PNG.
JPEGs and PNGs, when used properly, give better quality renditions from smaller files and are essentially universally supported in browsers these days.
(If you want to know more then I recommend PNG: The Definitive Guide – nominally it is out of print but you can pick up a second-hand copy on Amazon for pennies. You can also look for Image::Pngslimmer on CPAN if you are a Perl hacker – I wrote that!)
I wanted to reply to one of your comments on your post "Even if P=NP we might see no benefit", but I saw I can no longer do it on that page, maybe due to a problem with my internet. I was the person who claimed a possible demonstration of the "P versus NP" problem in my paper "P versus UP", which is published by the IEEE.
I have been working on that problem as a hobby since I graduated. I sent a preprint version to arXiv with the intention of finding some kind of collaboration. I also tried to find help on other blogs. But, finally, I decided to send to an IEEE journal a new version which differs from the preprints on arXiv, which I withdrew because they had some errors. Then I waited, and after revision at an IEEE journal it was published on 17 August 2012.
However, I wrote this paper in Spanish and I had the same version in English. So, I decided to send it to arXiv again, but they denied me that possibility; therefore, I used a pseudonym. I also uploaded other papers under that name which are not so serious but reflect the work that I am doing right now, as a hobby too.
I love computer science and maths. I am working right now on a project as important as P versus NP, but I do all this as a way of doing the things I like most, although my environment doesn't allow me to at all. I have also tried to work with other scientists who have invited me to work with them since I published my paper with the IEEE. Indeed, I don't want to be annoying with my comments; I am just seeking exchanges with other people who have the capacity to understand my work, that's all.
As a Birkbeck alumnus I have received an email from the college master, David Latchman, asking me to help promote part time higher education – and I am more than happy to do just that.
Professor Latchman writes:
Please help Birkbeck and support an important national #PartTimeMatters campaign to protect part-time higher education. In Adult Learners’ Week (18-24 May), a cross sector group including CBI, NUS, Universities UK, NIACE, WEA, UCU, Million Plus and of course Birkbeck and the OU are very keen to generate noise about some key messages:
• Part-time HE matters – there is a wealth of recent research that says that part-time HE matters for a whole variety of reasons. It supports the skills agenda and economic growth, it allows employees to upskill and reskill while still working, employers really value it, it creates opportunity for social mobility and has a significant impact on the life of the individual student
• Part-time HE was the biggest casualty of the 2012 reforms with a 40% downturn in enrolments across England. Part-time students make up a third of the undergraduate population and their future must be safeguarded
• Something must be done and part-time HE must be protected for future generations of adult and non-traditional learners
HOW YOU CAN HELP
Great short blog post here from John D. Cook in which he – and a commenter – show it is not such a difficult task after all: if you are clever enough to enter an odd number that doesn't end in 5 then you have an approximately 9% chance of moving ahead.