Having trouble building the riscv64-unknown-elf- toolchain?


I was – for quite a long time today – so I thought I’d share this fix. My problem was with downloading the glibc and newlib repos.

These were marked as being at git://sourceware.org/git/glibc.git and git://sourceware.org/git/newlib-cygwin.git.

When I replaced git:// with https://, the problem was solved.
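If you would rather not hunt down every reference to those addresses, git can do the substitution for you. This is just a sketch of an alternative approach using git’s insteadOf URL rewriting – it isn’t anything the toolchain’s build scripts require:

    # Tell git to fetch sourceware.org repositories over https rather than git://
    git config --global url."https://sourceware.org/git/".insteadOf "git://sourceware.org/git/"

With that in place, any clone or submodule fetch pointing at git://sourceware.org/git/glibc.git or git://sourceware.org/git/newlib-cygwin.git quietly goes over https instead.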


First year as a software engineer


Today marks the anniversary of me starting work as a software engineer. I love the job – despite some of the real challenges I’ve faced in a radical career change – and I do feel so very lucky to have got it in an exceptionally difficult time.

Some of the changes are about how I see myself – after more than 30 years of working in communications and public policy I was (regardless of what others thought about me, or even whether I was right or wrong) generally very confident in my own judgment and ideas. I’d been around the track – more than once – and whether you liked it or not I had a view I’d generally express. Now I am the start-of-career, not-long-out-of-university beginner. It can be daunting sometimes and I get things wrong, though my colleagues are generally happy to help (even if everyone working remotely does sometimes make that a little harder).

Well, I’m not quite new to everything – I know how corporate things work and while I am nobody’s manager I do know what that is about too (or at least I think I do).

Secondly, I am very much working in an engineering environment and not a computer science one. The differences are subtle and I cannot quite articulate them, but they are certainly real. “Scientists” (whether computing scientists – applied mathematicians really – or hard scientists) and engineers do tend to look at each other a little warily, and before I’d always been on the “other” side of this. But I am getting used to this too.

Above all it’s great to work somewhere where every day I am expected to think and apply that thought to solve novel and interesting problems.

Coming soon: RISCYFORTH


When I was much younger FORTH fascinated me as an alternative interpreted language for the Z80 eight-bit processor machines I was typically using.

Compared to BASIC – the language that the Sinclair ZX80, ZX81 and Spectrum came with – FORTH was reputedly lightning fast and very powerful.

My brother and I even obtained a tape copy of somebody’s FORTH for the ZX80 and we ran it – and it certainly was fast. But it also lacked the simplicity of BASIC and the tape was soon handed back.

But I’m back on the case again, inspired by this (long out of print but widely available on the web) book – Threaded Interpretive Languages – and by the prospect of a single board RISC-V computer – the BeagleV – coming out this year.

Currently I am targeting the PK proxy kernel on the Spike RISC-V emulator for RISCYFORTH, but if and when I get a real BeagleV I’ll immediately switch to that. (I applied to be an early user but have heard nothing, so while the signs are that the project itself is making good progress, it looks like I’ll have to wait to get my hands on one.)

I struggled with getting the mechanics of RISCYFORTH right for a long time but in the last week I’ve finally started to make serious progress and it actually does things (only in immediate mode for now). The picture shows my very first success with a multi-token command line from a couple of evenings ago and it’s come on a fair bit since then.

It’s nowhere near a releasable state but it’s rapidly improving.

Why bother? Well I think it’s important that RISC-V succeeds as a disruptor of what is starting to look like an ARM monopoly and so contributing to the ecosystem of the first single board seriously affordable RISC-V device matters. And, of course, because it’s there.

Always yield to the hands-on imperative (from this classic).

Update: My brother actually thinks we borrowed somebody’s Jupiter Ace, which was a Z80-based FORTH computer of a very similar size to a ZX81 – and I think he might be right.

Why I bought a fax machine


Actually, I didn’t realise I had bought a fax machine until the laser printer I knew I had bought turned up and I read on the packaging that it was also a fax machine.

(Fax is perhaps the most disruptive technology I’ve seen rise and fall in my time as an adult – I last used one in 2005 as far as I can recall but only 15 years earlier they were seen as cutting edge – but no matter…)

Are printers like fax machines in another way too? Destined to all but disappear as the tyranny of the screen grows ever stronger? The reasons that motivated me to buy the laser printer (and not just another cheap inkjet that produces shoddy output and falls apart after a few months) make me think not.

  • Paper is much more flexible than a screen – try scribbling a note on your screen and see how that goes.
  • Paper is the ‘rest energy’ form – it’s true that printing a page takes a lot more energy than clicking on an HTML link, but paper is more or less the zero energy, zero technology form of reading something – it’s generally easier to do than reading something on a screen (and if you drop a page you don’t generally risk losing your ability to read either, until you buy a new set of eyes).
  • Paper’s flexibility makes it easier to see links – this is a killer application for paper in my field of software engineering (though maybe not all engineers would agree) – you can see much more information at once.
  • Too many screens just aren’t very good for reading – text on small devices in particular – whereas when we print something we generally print it at a size that’s optimised for reading.
  • Screens tire your eyes in the way that paper just doesn’t.

Getting a laser printer as opposed to an inkjet feels like a bit of an indulgence, but as every other, cheaper printer we’ve had over the years has generally fallen apart quickly I am hoping it is going to deliver long-term satisfaction.

Alexa, tell me what changed your mind


We were given an Echo Dot for Christmas. And it’s just brilliant.

I have to admit I was pretty cynical – my principal experience with Apple’s voice activated “Siri” is that it doesn’t understand my accent (even though you can select an Irish voice for the output, forget about it for the input.)

But this is really great. It sits in the kitchen and has essentially replaced the digital radio and the fact that you can ask it (simple) things is a bonus.

One of the best things about it is that it allows me to spend 10 – 15 minutes listening to “Morning Ireland” on RTÉ Radio 1 every morning as I eat my toast – easy access to that perspective on world events (and on what the only country with a land border with the UK thinks about what is happening here) is a great thing to have.

Cannot recommend it highly enough.

A puzzle from Donald Knuth


Recently I had to write some code to generate a pseudorandom number in a system with very limited sources of entropy. So, of course I turned to Donald Knuth and, in particular, Volume 2 – Seminumerical Algorithms – in the magisterial The Art of Computer Programming.

Reading through the questions/exercises I then came across this one:

Prove that the middle-square method using 2n-digit numbers to the base b has the following disadvantage: if the sequence includes any number whose most significant n digits are zero, the succeeding numbers will get smaller and smaller until zero occurs repeatedly.

(Knuth rates this question as ’14’ which, on his scale of difficulty – where a ’10’ takes about a minute to solve and a ’20’ about twenty minutes – probably means it should take about 7 or 8 minutes, but I’ve spent much, much longer on it than that!)

A quick explanation: the middle-square method is a naive random number generation method (though actually first suggested by none other than John von Neumann) where we take a number n digits long, square it and take the middle n digits as our next seed or random number.

At first I thought I’d found an example where Knuth’s proposition appears to be false.

Let b=10 and the seed number be N_0 = 60: then every subsequent number in the sequence is also 60 (obviously that’s as useless as repeated zeros for a random number generator). The problem with that is that, although it demonstrates a weakness in the middle-square method, it doesn’t fit Knuth’s definition of the problem. What is n here? If n = 2 (the smallest n for which the most significant n digits of 60 are zero) then we are dealing with 2n = 4 digit numbers whose squares have up to 4n = 8 digits, so N_0 = 0060 and N_0^2 = 00003600, and taking the middle four digits gives N_1 = 0036, N_2 = 0012, N_3 = 0001, N_4 = 0 (thanks to Hagen von Eitzen for clarifying this for me.)
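That decay is easy to check numerically. Below is a minimal sketch of the middle-square step – my own illustration rather than anything from the book – which reproduces the sequence above:

    # Minimal sketch of the middle-square method for 2n-digit numbers to base b.
    # (An illustration of the decay described above, not code from Knuth or von Neumann.)
    def middle_square(x, n, b=10):
        """Square x, treated as a 2n-digit base-b number, and return the middle 2n digits."""
        square = x * x                            # at most 4n digits
        return (square // b ** n) % b ** (2 * n)  # drop the low n digits, keep the middle 2n

    # Reproduce the example: b = 10, n = 2, seed N_0 = 0060
    x = 60
    for _ in range(5):
        print(f"{x:04d}")          # prints 0060, 0036, 0012, 0001, 0000
        x = middle_square(x, n=2)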

So let’s look at the general case. (I also owe this explanation to Hagen von Eitzen, as compared to my very long-winded first attempt – though any errors that follow are mine, not his.)

So we have a number x which we think of as a 2n digit number – though the first n digits are 0, which means the largest x can be is b^n - 1 and so x^2 < b^{2n}.

Thus, since x \leqslant b^n - 1:

\frac{x^2}{b^n} \leqslant \frac{x(b^n - 1)}{b^n}

And \frac{x(b^n - 1)}{b^n} = x-\frac{x}{b^n}

If x were a full 2n digit number then the biggest number of digits we could get from the output would be 4n, but in our case only the lower n digits of x are non-zero, so the biggest size x^2 can be is 2n digits – again, the leading 2n digits (of the 4n) will be 0.

So, to apply the middle square method we need to lose the lower n digits – i.e., take \lfloor\frac{x^2}{b^n}\rfloor.

From the above:

\lfloor\frac{x^2}{b^n}\rfloor \leqslant \lfloor x-\frac{x}{b^n} \rfloor

If x > 0 then \lfloor x-\frac{x}{b^n}\rfloor is always less than x, so the sequence decreases until x = 0 – which is exactly the disadvantage Knuth describes.

The long haul


Far too much debate in the UK about responding to SARS-CoV-2 has been about short-termist responses. So, for instance, just as every lockdown has begun to have real effect it has been lifted in the name of the economy.

The result has been we are now in the longest lock-down of all, we’ve got the deepest economic setback of any major economy (though obviously there are other things going on to cause that too) and we have one of the highest death rates in the world. Not good.

It ought to be becoming clearer to more people that, actually, even after a mass vaccination programme (the one area where the UK has, thankfully, done well), the virus will not be gone from our lives. I don’t think there will, in my lifetime, be a return to what was fully “normal” as recently as December 2019.

Over time we can expect, as a species, to see the threat from the virus diminish as, like the common cold, more children who catch it while young grow to adulthood with a fully primed immune system. For the rest of us there will be vaccines – and there will also be mutations that may threaten our vaccine-acquired immunity.

We cannot stop dangerous mutations arising – so long as the virus is in circulation it will mutate and, if a mutation improves the virus’s ability to evade vaccines, those mutations will spread.

We can, though, slow the speed of the spread of any mutation through – you guessed it – social distancing, mask wearing and test-and-trace protocols. So these may eventually be relaxed as vaccination reaches more and more people, but it is hard to see them ever going away completely. I don’t expect you are going to be let into a hospital without wearing a mask for very many years to come, for instance.

Once we come to terms with the fact that we are here for the long haul we need to start reordering our society in that light. One of the things that surely must follow is some form of immunity/vaccination passport.

Until recently I thought this was a terrible idea – but since I recognised the truly long-term nature of the threat I have come to see such passports as inevitable, and necessary, and so the key issue is how they are introduced and used.

My initial thoughts are that firstly they should be a citizen’s right – everyone should be able to get one and access shouldn’t depend on wealth.

Secondly they should be regulated to an international standard that, as far as is practical, protects privacy and avoids unnecessary state monitoring. Or to be more direct: if the Russian (or any other) state wants to insist its citizens carry the equivalent of an electronic tag with them everywhere there isn’t much we can do to stop it, but we could say such devices are not recognised for use here.

Thirdly – and related to the first point – with the obligation to have one should come the right to access services. Public bodies or other service providers might have legitimate reasons to restrict access to those who have been vaccinated or are otherwise certificated, but they should not be able to refuse access to anyone who meets the criteria either. In other words if your body or company requires access to the information the passport contains then it must also submit to the responsibilities that come with it.

Still out there


Surprised and pleased to find that, a quarter of a century after I released it to a distinctly unmoved world – and a decade after I first mentioned it on this blog – the first piece of software I published, a not particularly brilliant program that allowed you to predict the result in a given UK constituency from a national opinion poll, is still available on an FTP server – ftp://ftp.demon.nl/pub/Museum/Demon/ibmpc/win3/apps/election/

Can’t actually run this on a 64-bit Windows system and the source (in Borland C++) is long gone…

After neo-liberalism: back to Bell Labs?


In general I hate the term “neo-liberal” – as in the last decade it has become a fashionable way for some people on the left to say “things I don’t like” whilst giving their (often irrational) personal preferences a patina of intellectual credibility.

Glen O’Hara looks at the accusation that the last Labour government was “neoliberal” in some detail and I’m not going to reheat his arguments here, but as he says:

This rise in public spending was not only imagined in liberal terms—as a new contract between consumers and providers. For the emphasis on neoliberalism also misses the fact that the Blair agenda sought specifically to rebuild the public sphere around a new vision of a larger, more activist but more responsive and effective state. First through targets—and then, when they seemed not to deliver strong improvement, through decentralised commissioning and choice—the government sought to improve public-sector performance in a way that would be visible on the ground, and so maintain its relevance and political support.

But the term is not in itself meaningless – personally I find it much more useful as a tool of analysis when applied to how governments across much of the western world have approached the private (and not the public) sector over the last forty years. For sure there has been privatisation too – expanding the role of the private sector – but certainly in the UK the long-term picture has not been a story of a shrinking state, but of a state spending money in different ways. See the chart, which plots the share of public spending as a proportion of GDP since the end of the 1950s – and notably this does not include the massive covid-19 driven spike of public spending in 2020 and 2021.

What has diminished is both state ownership and (much more importantly, I believe) state partnership with key economic sectors that provide private goods and services – until, perhaps, today (as I discuss below).

As my example (from the US, but the argument applies more widely), let me look at AT&T. Today what was once the American Telephone and Telegraph Company is still the world’s largest telecommunications concern, but it’s a very different beast to the company of that name of forty years ago. Now it competes in a cut-throat global market; then it was a highly-regulated, privately-owned classical monopoly utility.

No doubt its break-up from 1984 onwards meant Americans got smaller phone bills (if they use land lines at all) but what has the overall balance for society been?

Reading Brian Kernighan’s UNIX: A History and a Memoir and the earlier The Idea Factory you get the impression that subjecting corporates to cut-throat competition has not all been about wins for the consumer. The “Bell System” monopoly paid for a massive research operation that delivered the transistors that made the digital age possible and the Unix that now dominates, and an awful lot else besides.

AT&T’s shareholders didn’t reap the massive windfalls seen by people who invested in Amazon twenty years ago, but their stocks paid a consistent dividend and the economy in which they operated also generally grew steadily. Investors got a stable return and AT&T also had the ability to risk capital on long-term research and development.

The neoliberal revolution in the private sector has indeed given us Amazon (and Apple) and with it massive disruption that often is beneficial to humanity as a whole (think of the evaporation of poverty in much of east and south east Asia). But has it delivered fundamental advances in human knowledge of the scale and power that the older regulated capitalism did? I feel less than fully convinced.

The counter-case is, of course, in the field of bio-medicine. The enormous productive power that a globalised capitalism possesses is, even as I write this, churning out the product of the most spectacular scientific effort in all human history – a vaccine against covid-19. No previous human generation has been able to do what we now believe we – for very good reasons – can: meet a newly emerged global epidemic in open combat and win.

But the story of the vaccine is also a story of partnership between state and capital. Governments have shared risk with the pharmaceutical companies but competition has also played its part – to me it suggests a future beyond a neo-liberal approach to the private sector in key industrial areas. The state should not be trying to pick winners but sharing risks and building an economic ecosystem where a balance of risks and rewards means that the aim is not to find the next 10000% return on investment but where good research can be allowed to thrive.

I know this is in danger of sounding very motherhood-and-apple-pie, and we should be wary of just propping up existing market giants because they happen to be market giants. So let me also make a suggestion: imagine if the UK government decided that, instead of forever spending large amounts on office suites from large software houses that are installed on millions upon millions of computers in schools, hospitals and police stations, it indicated it was willing to pay a premium price for a service contract of, say, 5–7 years for someone who could turn one of the existing free software office suites into a world-class competitor – and, more than that, it was willing to provide capital, as an active investor, in the two or three companies that could come forward with the best initial proposals?

The private sector would be shouldering much of the risk but would be aiming for a good reward (while free software’s built-in escrow mechanism would also mean that the private contractor couldn’t just take the money and ‘steal’ the outcome). Ultimately citizens (globally) could expect to see real benefits and, of course, we would hope any current monopolist would see competition coming and be incentivised to innovate further.

The missing link and closing schools


London, where I am writing this, is now perhaps the global centre of the covid-19 pandemic, thanks to a mutation of the virus that has allowed it to spread more easily. This mutation may not have come into existence in the South East of England but it has certainly taken hold here, and about 2% of London’s population currently have symptomatic covid.

In response all primary and secondary schools, which were due to open tomorrow, will be effectively closed and teaching will go online.

Suddenly the availability of computing resources has become very important – because unlike the Spring lockdown, when online teaching was (generally) pretty limited, this time around the clear intention is to deliver a full curriculum – and that means one terminal per pupil. But even now, how many homes have multiple computers capable of handling this? If you have two children between the ages of 5 and 18 and two adults working from home, it is going to be a struggle for many.

Thus this could have been the moment that low cost diskless client devices came into their own – but (unless we classify mobile phones as such) they essentially don’t exist. The conditions for their use have never been better – wireless connections are the default means of connecting to the internet and connections are fast (those of us who used to use X/Windows over 28kbit dial-up think so anyway).

Why did it not happen? Perhaps because of the fall in storage costs? If screen and processor costs haven’t fallen as fast as RAM and disk then thin clients get proportionally more expensive. Or perhaps it’s that even the fat clients are thin these days? If you have a £115 Chromebook then it’s probably not able to act realistically as a server in the way a laptop costing six times as much might.

But it’s also down to software choices and legacies. We live in the Unix age now – Android mobile phones and Mac OS X machines, as well as Linux devices, are all running some version of an operating system that was born out of an explicit desire to create an effective means to share time and resources across multiple users. But we are also still living in the Microsoft Windows era too – and although Windows has been able to support multiple simultaneous users for many years now, few people recognise that, and even fewer know how to activate it (especially as it has been marketed as an add-on and not the built-in feature we see with Unix). We (as in the public at large) just don’t think in terms of getting a single, more powerful device and running client machines on top of it – indeed most users run away at the very idea of even invoking a command-line terminal, so encouraging experimentation is also difficult.

Could this ever be fixed? Well, of course, Chromebooks are sort of thin clients, but they tie us to the external provider and don’t liberate us to use our own resources (well, not easily – there is a Linux under the covers though). Given the low cost of the cheapest Chromebooks it’s hard to see how a challenger could make a true thin-client model work – though maybe a few educational establishments could lead the way, giving pupils and students thin clients that connect to both local and central resources from the moment they are switched on?