Does “coding” have a future?

Today is, for me, the last working day of the year and I was able to finish with a small triumph – successfully solving several programming conundrums that have eaten into my time over a number of weeks.

The technology involved – Python – is not one I have had much experience with, and only this morning I was wondering if I’d just bitten off more than I could chew. So getting Jenkins (the continuous-integration system that runs our tests) to “go green” with all the required functionality in place feels like a bit of a big deal from where I am sitting.

It also reminds me of one of my favourite school days – fifty years ago this week. That term I had started at a new school, because my old one had been occupied first by the British Army and then, when the soldiers moved out, by the families they had displaced. Nineteen seventy-two was the maddest, most awful year of “the Troubles”, and West Belfast was right at the heart of it.

I didn’t mind moving: I wasn’t all that happy at the old school, and my new teacher in “P3” (the same age group as American first graders), Mrs MacManus, was wonderful.

That week she had held an “essay” competition, challenging us all to write a Christmas story. On the final day of term she handed out prizes to the top three, in reverse order. By the time she’d announced who came second, all my hopes were gone – I was surely not going to be better than those boys. But – as I’m sure you’ve guessed by now – I won, and took home a stocking full of sweeties alongside the sense of achievement. The flood of happiness as I walked out of class (it was a half day) immediately afterwards was enormous.

If I had an ambition for the future at that time, though, it wasn’t to be an essayist or any sort of professional writer, but to be some sort of “scientist”. I never made it, of course (I don’t really think that computer “science” is a science at all, so much as a form of applied mathematics), and in fact I spent much more of my working life depending on my writing ability – for good or ill – than on anything else.

Until, that is, three years ago, when, having finally completed my computer science PhD, I decided I actually had to get a job writing computer programs.

I was exceptionally lucky: the people who became my employers described my job interview, held in an otherwise deserted office, as the last face-to-face meeting they were likely to have with anyone for some time. The official lockdown had not yet begun, but already the drawbridges were being raised.

Occasionally since then friends have expressed a mixture of wonder and puzzlement at the change of tack – often, I think, because it feels to them that I’ve entered some realm of immense complexity that is simply beyond their understanding. My retort that programming a computer – at a basic level – isn’t really a higher skill seems to cut little ice.

I don’t mean that anyone could write good programs, but I do think that most people could write a program after not much study. While “algorithm” is a much over-used word, an algorithm is really just a set of simple steps towards an end, and it was no accident that Turing’s seminal model of computing was based on the very simple idea of reading and writing one character at a time on a strip of paper.
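To make that concrete, here is a toy sketch in Python (the state names and the example machine are my own invention, purely for illustration) of a Turing-style machine: a tape of characters, a head that reads and writes one cell at a time, and a small table of rules.

```python
# A toy Turing-style machine. The tape is a list of characters, the head
# reads and writes one cell at a time, and a rule table maps
# (state, symbol read) -> (symbol to write, head movement, next state).
# This particular machine just flips every 0 to 1 and vice versa, then halts.
RULES = {
    ("flip", "0"): ("1", +1, "flip"),
    ("flip", "1"): ("0", +1, "flip"),
    ("flip", "_"): ("_", 0, "halt"),  # "_" marks the blank end of the tape
}

def run(tape: str) -> str:
    cells = list(tape) + ["_"]
    head, state = 0, "flip"
    while state != "halt":
        symbol_to_write, move, state = RULES[(state, cells[head])]
        cells[head] = symbol_to_write
        head += move
    return "".join(cells).rstrip("_")

print(run("10110"))  # prints 01001
```

Nothing in there is beyond someone who has done a few weeks of study – and yet, in principle, the model this toy illustrates can compute anything any computer can.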

This is why I do think that, in the next decade and maybe even sooner, we will see AI systems make big inroads into writing computer programs. In environments where computing power and space are relatively plentiful, and where timing is not a factor in whether a computation succeeds or fails, there is simply no reason to expect otherwise.

This simple “coding” is not an art of dark magic; in fact, much of it is repetitive “boilerplate” that sets up the initial conditions for a routine calculation, along with “CRUD” (create, read, update, delete) code to manage the record keeping.
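To show what I mean, here is a minimal CRUD sketch in Python – the “customers” table and its fields are invented for the example, but code of this shape gets written, with minor variations, countless times a day.

```python
import sqlite3

# Routine record keeping: create, read, update and delete rows in a table.
# An in-memory SQLite database keeps the example self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

def create(name: str, email: str) -> int:
    cur = conn.execute("INSERT INTO customers (name, email) VALUES (?, ?)", (name, email))
    conn.commit()
    return cur.lastrowid

def read(customer_id: int):
    return conn.execute("SELECT * FROM customers WHERE id = ?", (customer_id,)).fetchone()

def update_email(customer_id: int, email: str) -> None:
    conn.execute("UPDATE customers SET email = ? WHERE id = ?", (email, customer_id))
    conn.commit()

def delete(customer_id: int) -> None:
    conn.execute("DELETE FROM customers WHERE id = ?", (customer_id,))
    conn.commit()

new_id = create("Ada", "ada@example.com")
update_email(new_id, "ada@example.org")
print(read(new_id))  # (1, 'Ada', 'ada@example.org')
delete(new_id)
```

There is nothing here an experienced programmer hasn’t written a hundred times before – which is exactly why it is such an easy target for automation.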

If anything, it is the need for so much of this routine stuff that means so many people can make a living out of writing code without necessarily being all that good at it. And this is where AI – which in many cases is just going to copy and marginally adapt stuff other people have already written – will cut through like a knife through butter.

Is that awful? A disaster? No. At least not necessarily.

The route to human happiness lies in diminishing the realm of necessity and giving each and every one of us more free time and resources to do nicer and better things. And there is nothing particularly nice about writing page after page of ‘getter and setter’ methods for a Java class.
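Python, the language I’ve been working in, has already automated some of that particular chore away – the dataclasses module in the standard library generates the repetitive methods for you. The class below is an invented example, a rough analogue of the Java tedium:

```python
from dataclasses import dataclass

# Without @dataclass, a class like this would need hand-written __init__,
# __repr__ and __eq__ methods: exactly the boilerplate nobody enjoys writing.
# The decorator generates them all automatically from the field declarations.
@dataclass
class Customer:
    name: str
    email: str
    account_balance: float = 0.0

c = Customer("Ada", "ada@example.com")
print(c)  # Customer(name='Ada', email='ada@example.com', account_balance=0.0)
```

If a standard library feature can erase that drudgery, it is no great leap to imagine AI erasing a good deal more of it.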

The issue, of course, is the distribution of the benefits that will come from the change. If it all goes into the pockets of Jeff Bezos, Elon Musk or even Bill Gates that would indeed be bad.

(The days when I thought Bill Gates was the emperor of the evil Windows empire trying to crush us plucky Linux rebels are gone: my complaint is not that Gates is a bad person – he’s demonstrably the opposite – but that no-one should be able to decide so much simply by dint of having money.)

But if, instead, the time and money liberated are spread more fairly – not least to train more people to be masters of the opportunities computing power gives us, and not just code monkeys – then it represents classic progress.

What is wrong today is not that technology is destroying livelihoods but that too few people are being allowed to benefit from the changes technology brings.
