Was new maths really such a disaster?


Freeman Dyson (Photo credit: Wikipedia)

On holiday now, so I fill my time – as you do – by reading books on maths.

One of these is Julian Havil’s Gamma: Exploring Euler’s Constant.

The book begins with a foreword by Freeman Dyson, in which he discusses the general failure, as he sees it (and I’d be inclined to agree), of mathematical education. He first dismisses learning by rote and then condemns “New Mathematics” as – despite its efforts to break free of the failures of learning by rote – an even greater disaster.

Now, I was taught a “new maths” curriculum up to the age of 16 and I wonder if it really was such a disaster. The one thing I can say is that it didn’t capture the beauty of maths in the way that the ‘A’ level (16 – 18) curriculum came close to doing. At times I really wondered what it was all about – at the age of 15, matrix maths seemed excessively abstract.

But many years later I can see what all that was about, and I do think that my new maths education has given me quite a strong grounding in a lot of the fundamentals of computing and the applied maths it involves.

To the extent that this blog has an audience, I know it is read by those with an interest in maths and computing, and I would really welcome views on the strengths and weaknesses of the new maths approach.

A horror story with a happy ending (hopefully)


An LGM-25C Titan intercontinental ballistic missile in silo, ready to launch (Photo credit: Wikipedia)

Command and Control is not a piece of light reading – in any sense. But it is an absolutely essential book.

It tells the story of the United States’ nuclear weapons programme from the Manhattan Project to the present day, with an emphasis on safety management, foregrounding the story of a particular accident in a Titan II missile silo in 1980.

Finishing it, you are left wondering why you are here at all – because it is surely more by luck than by design that civilisation has managed to survive the nuclear age – particularly through the forty-five years of the Cold War when, more or less, fundamentally unsafe weapons were handed out willy-nilly to military personnel who were not even vetted for mental illness.

We read of how politicians – Eisenhower, Kennedy, Nixon, Carter – all tried, to varying degrees, to get some sort of grip on the nuclear colossus (Eisenhower comes off worst, as a fundamentally weak man) and all essentially capitulated to a military more interested in ensuring its weapons would work when needed than in ensuring they were safe when not.

The good news is that the book has a relatively happy ending: the end of the Cold War and the persistent efforts of a few scientists and engineers deep within the US nuclear weapons programme eventually led to safety being given a greater priority. The chance of an accidental nuclear war is probably lower now than it has ever been – but it is not zero.

The book, perforce, does not give us much insight into the Soviet (or Chinese, or indeed French, British, Indian, Israeli or Pakistani) nuclear programmes – was the Soviet programme safer because state control was so much stricter (the fear of Bonapartism), or more dangerous because the Soviets were always running to catch up? The book suggests both at different points.

It’s brilliantly written too – so if you want a bit of chill to match the summer sun in your holiday reading, I do recommend it.

Forty-five years on


Archive: Apollo 11 Sees Earthrise (NASA, Marshall, 07/69) (Photo credit: NASA’s Marshall Space Flight Center)

I’m a day late here, but the sheer brilliance of the achievement of Apollo 11 means I have to write of it.

I was just three, but I remember the day well, watching the black and white images on the TV in the corner of the room in Donegal – where we were on holiday.

Apollo 11 has shaped my life in a very real way: it gave me a lifelong love of science.

Curses on ncurses


gdb icon, created for the Open Icon Library (Photo credit: Wikipedia)

Every programmer will be familiar with something like this…

A little while back I wrote a program that simulates – crudely but effectively – a multicore NoC (network-on-chip) device. I use it to model the execution times of different page replacement algorithms.

The input is XML generated via a step-by-step trace of a working program. The actual instructions being traced do not matter – what I care about are the memory access patterns.

To allow me to test more models more quickly I have now written some R code that generates a semi-random access pattern based, very loosely indeed, on the patterns seen in the real program. The advantage is I can test against a set number of memory accesses but with a range of pseudo-random access patterns, so although I am not running models against the “real” access pattern, neither am I taking three weeks per experiment.

But when I used the artificially generated access patterns, my program crashed with a seg fault. Even more confusingly, when I ran the code in GDB, the GNU Debugger, it worked if I stepped through it, yet if I simply let it run under the debugger it crashed just as it did outside the debugger.

After a few hours I realised why: in my artificial patterns, the first thing the first thread does is spawn all the other threads to be used. In real-world code, of course, these spawns take place only after quite some code has been executed.

Every thread spawn causes the ncurses code I am using to update the screen. When using ‘real’ access patterns these updates take place comfortably after the ncurses environment has been set up (by a separate thread), but with the artificial patterns the thread updates are the first things posted to the screen, before ncurses has been set up – hence the crash.

If I step through the code, the ncurses thread runs ahead and sets up the screen before I hit the thread-update code, so again it works.

The solution? Use a condition variable and a mutex to ensure that nothing executes before the ncurses environment is fully established.
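
In outline the fix looks something like the sketch below – a minimal, illustrative version with made-up names, not my simulator’s actual code. Worker threads block on a condition variable until the thread driving the display has finished initialising ncurses.

    #include <ncurses.h>
    #include <pthread.h>

    /* Shared state guarding ncurses initialisation. */
    static pthread_mutex_t ui_lock  = PTHREAD_MUTEX_INITIALIZER;
    static pthread_cond_t  ui_ready = PTHREAD_COND_INITIALIZER;
    static int curses_up = 0;   /* set once initscr() etc. have run */

    /* Display thread: set up ncurses, then wake everyone waiting. */
    static void *ui_thread(void *arg)
    {
        (void)arg;
        initscr();
        noecho();
        curs_set(0);

        pthread_mutex_lock(&ui_lock);
        curses_up = 1;
        pthread_cond_broadcast(&ui_ready);
        pthread_mutex_unlock(&ui_lock);

        /* ... main display loop ... */
        return NULL;
    }

    /* Called by every worker thread before its first screen update:
       blocks until the ncurses environment is fully established. */
    static void wait_for_curses(void)
    {
        pthread_mutex_lock(&ui_lock);
        while (!curses_up)
            pthread_cond_wait(&ui_ready, &ui_lock);
        pthread_mutex_unlock(&ui_lock);
    }

The while loop matters: pthread_cond_wait can wake spuriously, so the predicate has to be re-checked under the mutex.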

Not a big deal – but perhaps, at some point in the future, someone will be struggling to understand why their code – which previously worked so well – has stopped processing what seems to be well-formed input. Hope this helps!

Do I qualify as a LISP hacker now?


I have just managed to write my first working program (as opposed to a few fragments of code to produce some charts) in R.

It turned out to be harder than I expected: years with BASIC, C, C++, Perl and the like meant I made assumptions about the atomic nature of numbers that R’s vector model choked on – in R even a single number is really a vector of length one, and operations work element-wise across whole vectors – and it took me some time to work that out.

Time to write a signal handler?


Unix Creators at DEC PDP11 (Photo credit: PanelSwitchman)

I am trying to execute some self-written pieces of software that require a lot of wall clock time – around three weeks.

I run them on the University of York’s compute server, which is rebooted on the first Tuesday of every month, so the window for the software is limited: I have until about 7am on 5 August before the next reboot.

To add to the complication, the server runs Kerberos, which does not seem to play well with the screen/NFS combination I am using.

And – I keep killing the applications in error. This time, just half an hour ago, I assumed I was on a dead terminal session (i.e. an ssh login which had long since expired) and pressed ctrl-C, only to discover, to my horror, that it was a live screen (it had not responded to ctrl-A, ctrl-A for whatever reason).

Time to add a signal handler to catch ctrl-C to at least give me the option of changing my mind!
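
Something like the sketch below is what I have in mind – illustrative only, not the actual code. The handler just records that ctrl-C was pressed (about the only safe thing to do inside a signal handler), and the main loop asks for confirmation at a convenient point.

    #include <signal.h>
    #include <stdio.h>
    #include <unistd.h>

    static volatile sig_atomic_t interrupted = 0;

    /* Async-signal-safe: just record that SIGINT arrived. */
    static void on_sigint(int sig)
    {
        (void)sig;
        interrupted = 1;
    }

    int main(void)
    {
        struct sigaction sa;
        sa.sa_handler = on_sigint;
        sigemptyset(&sa.sa_mask);
        sa.sa_flags = 0;
        sigaction(SIGINT, &sa, NULL);

        for (;;) {
            /* ... one step of the long-running computation ... */
            sleep(1);   /* placeholder for the real work */

            if (interrupted) {
                interrupted = 0;
                fprintf(stderr, "\nCaught ctrl-C: really quit? (y/n) ");
                int c = getchar();
                if (c == 'y' || c == 'Y')
                    return 0;   /* confirmed: exit */
                while (c != '\n' && c != EOF)
                    c = getchar();   /* swallow the rest of the line */
                /* otherwise carry on where we left off */
            }
        }
    }

Because the flag is a volatile sig_atomic_t and all the I/O happens in ordinary code rather than in the handler, this stays within what POSIX allows a signal handler to do.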
