I know that, around this time of year, a lot of people in London are considering whether to press on with their application to Birkbeck College, so here I am, hoping to pick up the passing Google traffic, urging you to go if:
- You are prepared to do the work;
- You want to realise your potential;
- You want to change your life.
Going to Birkbeck will not create a new you – but it could allow the real you to escape from wherever it has been hidden these last few years.
And it is never too late to change.
So, go for it!
- Stanford Makes Bid for Birkbeck’s Rumfitt (leiterreports.typepad.com)
- Birkbeck’s Arts Week: Birkbeck Forum for Nineteenth-Century Studies events (5/20-21/2013) (navsa.blogspot.com)
- Bakewell defends part-time study (bbc.co.uk)
- 15K in May: writing in the Tang Dynasty (ruthlivingstone.wordpress.com)
- Lecture Series: Birkbeck Forum for Nineteenth-Century Studies 2013 Summer Term Programme (navsa.blogspot.com)
- Conference: ‘Boycotts – Past and Present’, International Conference, 19-21 June 2013, Pears Institute for the Study of Antisemitism, Birkbeck, University of London (britishjewishstudies.org)
Personally I love Wikipedia. Over many years I have dabbled in editing various entries in areas where I have some reasonably expert knowledge, and I can still see fragments of my edits from long ago in the ever-expanding encyclopedia.
But I have also recognised for years that there are some serious biases in it. Let me give a few examples…
Many years ago I stumbled across the entry for Sasolburg – an industrial town in the Free State province of South Africa. This town played a highly significant part in the modern history of South Africa – for it was here in 1980 that the cadres of Umkhonto we Sizwe (MK), the armed wing of the African National Congress, signalled to the world – and more importantly to every South African – that things were different now, by blowing up one of the highest-profile parts of the apartheid regime’s efforts to evade a spreading oil embargo, an oil-from-coal plant.
But repeated efforts to get this into the Wikipedia article were resisted and eventually reduced to one sentence – and a cursory look across the very thin entries on South Africa shows, to me at least, a systematic bias against the ANC.
For another South African example, have a look at the entry on the Soweto Uprising – surely one of the most significant events in the country’s history whichever way you look at it, but given only a very short entry.
Things are getting better – there is now quite a large range of articles on South Africa’s liberation struggle, where only four or five years ago there were very few.
But, as a recent article in the New Scientist points out, Wikipedia has a real blind spot when it comes to covering Africa – there are more articles on “Middle Earth” than on many African states, and perhaps ten times as many Wikipedia edits (in any language) originate in the United Kingdom as in all of Africa.
And that’s not the only problem – 91% of Wikipedia editors are male and, of course, that is contributing to Wikipedia’s growing reputation as the home of the same sort of maladjusted and poorly socialised individuals who inhabit various parts of the “open source” software world.
One thing the article does not cover is the high proportion of articles in the encyclopedia that are clearly being edited by their subjects (or their paid agents) – or their partners: have a look at the entry on Scottish-independence-supporting tax exile Jim McColl and the edits made by “Shona7” – Jim McColl’s wife happens to be called Shona, that’s all I am saying!
- 50 more of Wikipedia’s most interesting articles (titifoti.wordpress.com)
- School of Open offers free Wikipedia course (wikimedia.org)
- The Economist explains: Who really runs Wikipedia? (economist.com)
- Who speaks for the women of Wikipedia? Not the women of Wikipedia. (hastac.org)
- Live map of recent changes to Wikipedia articles is mesmerizing (arstechnica.com)
It looks like the Parallella, the Network-on-Chip device funded through Kickstarter (which I backed), is going to happen, if only a little late:
Great news, the Parallella is now a real computer!! The gigabit ethernet port is working and the full Ubuntu desktop version is up and running! The full story is posted at parallella.org.
Our goal is to ship the first 100 Parallella boards to early access backers by the second week of June.
Since we are getting close to shipping the first boards, here are a couple of reminders.
Don’t worry about sending us your shipping addresses. We will send out an email to confirm shipping addresses before shipping. Just make sure you put firstname.lastname@example.org in your “good guy” contact list to avoid the email getting lost in your spam filter.
If you want to order more Parallella boards you need to make a reservation by providing your email address at parallella.org. We have 11K reservations so far! We will soon allow pre-purchasing of Parallella boards and those who reserved a board will get first dibs. (NOTE! Pre-purchased boards will only ship after we have fulfilled all of our commitments to you, our KS backers.)
- Adapteva shows off production Parallella mini ‘supercomputer’ boards (engadget.com)
- Parallella runs Linux! (kickstarter.com)
- Parallella: The $99 Linux supercomputer (zdnet.com)
Peter J. Denning is something of a computing hero to me. He formulated the concept of the working set, around which I based my MSc project and he has been personally very kind to me in reading and commenting on that report.
So the article he has written (for the ACM) with Nicholas Dew, in which they debunk the idea of the “elevator pitch” as a key business tool, is doubly interesting, as, in my “real” job, elevator pitches are regarded as essential tools of the trade:
To make this work, you need to re-interpret the pitch. It is not a transmission of information but an offer to have a conversation. It is often much easier to ask someone to join you in a conversation than it is to present a polished, sticky, commercial-grade presentation. A conversational pitch will get you closer to your idea being adopted.
Well worth reading the whole thing – it’s not long, though not quite of elevator pitch length either!
British readers of a certain age may remember a groundbreaking TV series from the autumn of 1979 – The Mighty Micro – in which Christopher Evans discussed the impact of the coming microchip revolution. (The series was broadcast after Evans had died, aged just 48).
In many ways the programmes – from what I can remember (and I was an avid viewer) – rather underestimated the impact of what was to follow. But the last programme did – and still does – stick in the memory because of what seemed, and seems, like a hyper-optimistic prediction: that microchips could save us from war.
Essentially Evans’s view was that, by hugely increasing computational power, micro-powered computing would allow us to predict accurately the outcome of military conflict and so prevent it (why start a war when you know you are bound to lose, or when you, and your domestic critics, know that even a victory would devastate your society?).
There are a lot of flaws in this argument. One only has to think of the jihadist claim to “love death” to recognise that the certainty of defeat might not be deterrent enough, and the 2008 financial crisis also demonstrates that increased computing power might just create new ways to mess things up, not to solve them.
But, but, but… maybe there is something to it after all. This week’s New Scientist reports on the release of the “Global Data on Events, Location and Tone” (GDELT) data set and the way it has been successfully used by Jay Yonamine, then a PhD student at Penn State, to model the spread of conflict in Afghanistan.
Yonamine was able to successfully model how the conflict would spread through Afghanistan using GDELT, which geolocates major news stories and uses natural language processing to store a very short summary of each.
Modelling how the conflict spread is not the same as predicting where the next jihadist-inspired conflict will take place, of course, but it may be the first step towards being able to draw out the undercurrents of news stories and issue early warnings. The key question is whether it can be an effective leading indicator.
Maybe the idea has promise. At the very beginning of my memories of the world are the events of August 1969 – when the British Army was drafted onto Northern Ireland’s streets to avoid a bloodbath. Just six months before, no one would have predicted that would happen – even if the tempo of civil disturbance had been increasing – and certainly no one expected the troops to stay on the streets, as they did, for the next 30 years. And more importantly, perhaps, nobody – beyond some zealots on either side – would have wanted either outcome.
Again, think of the 2007–2008 financial crisis. Could it have been foreseen as early as 2004? Certainly some politicians claim that it could – but how could you tell whether they were any good at prediction? Mitt Romney, a pretty serious person after all, really believed that he would be president even on the night of the election – does that mean everything he says is nonsense, or just some things?
Big data might help sort some of that out too. Another piece of research highlighted by New Scientist, undertaken by Tobias Preis at the University of Warwick with Helen Susannah Moat and H. Eugene Stanley of Boston University, suggests that an investment strategy based on analysis of Google Trends could have made substantial sums over the 2004–2011 period (see graph).
Their abstract states:
Crises in financial markets affect humans worldwide. Detailed market data on trading decisions reflect some of the complex human behavior that has led to these crises. We suggest that massive new data sources resulting from human interaction with the Internet may offer a new perspective on the behavior of market participants in periods of large market movements. By analyzing changes in Google query volumes for search terms related to finance, we find patterns that may be interpreted as “early warning signs” of stock market moves. Our results illustrate the potential that combining extensive behavioral data sets offers for a better understanding of collective human behavior.
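The trading rule behind that result is simple enough to sketch. Below is a toy version, on entirely synthetic data (the series, the three-week window and all the numbers are illustrative assumptions, not the paper’s data): if this week’s search volume is above its recent average – interest rising – you bet the market falls next week; otherwise you bet it rises.

```python
import random

random.seed(42)

# Synthetic weekly series, purely illustrative -- the paper used the
# Dow Jones index and real Google Trends volumes for finance-related terms.
weeks = 300
price = [100.0]
for _ in range(weeks):
    price.append(price[-1] * (1.0 + random.gauss(0.0, 0.02)))
volume = [random.random() for _ in range(weeks + 1)]

def trend_strategy(price, volume, delta_t=3):
    """Preis/Moat/Stanley-style rule: if this week's search volume is
    above its mean over the previous delta_t weeks, go short for the
    coming week; otherwise go long. Returns the sum of the (signed)
    weekly returns the strategy would have captured."""
    total = 0.0
    for t in range(delta_t, len(price) - 1):
        avg = sum(volume[t - delta_t:t]) / delta_t
        weekly = (price[t + 1] - price[t]) / price[t]
        total += -weekly if volume[t] > avg else weekly
    return total

print("strategy return on synthetic data: %+.3f" % trend_strategy(price, volume))
```

On random data like this the expected return is zero, of course – the paper’s claim is precisely that on real 2004–2011 data the search-volume signal was informative.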
So, risking the wrath of John Rentoul, this could be A Question To Which The Answer Might Not Be No.
- GDELT: a big data history of life, the universe and everything (guardian.co.uk)
- World’s largest events database could predict conflict (newscientist.com)
- What can we learn from the last 200 million things that happened in the world? (beowulfjournal.wordpress.com)
- Google Trends Big Data For Predicting the Market: Deep Dive and Current Predictions (forbes.com)
- Mapping the GDELT data (and some Russian protests, too) (r-bloggers.com)
- Predictive Application Tracks Global Events (arnoldit.com)
- Google big data sets forecast stock market movements, researchers claim (itpro.co.uk)
- Can Wikipedia Predict The Stock Market? (shinesquad.me)
- Using Big Data to Save the Planet (slashdot.org)
- When will computers out think You? (workingthewebtowin.blogspot.com)
I did not vote for either party in the current coalition government in Britain and I doubt I ever will. But credit where credit is due – they have done quite a good job at beginning to fix (most) government IT procurement and actually made the claim of the previous Labour government that they were open to using free and open source software (FOSS) real.
Still, they have not fixed all IT procurement – the politically driven “Universal Credit” project looks to me like it will make all the failed big procurements of the Labour years look like well-thought-through successes. The lesson is that when IT policy passes out of the hands of Francis Maude at the Cabinet Office – who has done so much to drive the politics of FOSS – the government heads for disaster.
The latest, and possibly the biggest yet, potential disaster (it might wreck the mobile data market if pushed too far) seems to have been signalled in the “Queen’s Speech” – the formal outline of the government’s legislative programme for the next year. In it the Queen, on behalf of the government, pledged:
In relation to the problem of matching internet protocol addresses, my government will bring forward proposals to enable the protection of the public and the investigation of crime in cyberspace.
This is to replace the so-called “snoopers’ charter” – a proposal, much like that from the previous Labour government (which the Tories scrapped within days of coming to office only to attempt to revive two years later), to force service providers to maintain records of all internet browsing and emails (records in the form of which computer interacted with which rather than the content of the communications) so that these might be accessed by the police and the domestic security service, MI5.
The revived proposal was squashed by the Liberal Democrats, the coalition’s junior partner, on “civil liberties” grounds. But now it seems that the Lib Dems have also been persuaded something needs to be done and so have backed an idea – agreed with the Tories – to give every device that connects to the internet its own IP address.
Great idea! Except – well, it just won’t work with the bulk of today’s internet.
It is quite difficult to know exactly what the government is thinking of, as the whole idea seems too cracked, but let us give an (entirely fictitious) example. Mr A. Docstudent has been accused of smuggling quails’ eggs into the UK, but when the police raid his home they find nothing. They still suspect he has been in touch with Mr B. I. G. Smuggler, the quails’-eggs kingpin, and that he has been sending emails from various university networks using the fake identity Ms N. O. Clu.
The problem is that, without a record of when Docstudent authenticated his devices against the international Eduroam network, they cannot even prove Docstudent was on a university campus at the time. But if every one of Docstudent’s devices had a unique internet address then they could simply point to that and say “you used your Psion 3 to send that one, you used your Sinclair ZXMobile to send that one” and so on.
So an easy solution is to give every device a unique IP address – after all, all your devices already have a hardware identifier, the so-called “MAC address”, which is unique to your machine. Force retailers to log who has which unique address (which can be based on the MAC address) and you have something of a nightmare of a register, but it’s simpler than the “snoopers’ charter”. Or we could just ignore that, go for raids, and test the MAC addresses of all seized devices against the unique identification.
And we even have a fully worked out way of converting MAC addresses into unique network addresses – via version 6 of the Internet Protocol (IPv6).
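That worked-out scheme is the “modified EUI-64” interface identifier of RFC 4291: split the 48-bit MAC in two, insert ff:fe in the middle, and flip the universal/local bit of the first octet to get the bottom 64 bits of the address. A minimal sketch (the MAC address and the link-local fe80:: prefix are just examples):

```python
def mac_to_eui64(mac: str, prefix: str = "fe80::") -> str:
    """Derive an IPv6 address from a 48-bit MAC address using the
    modified EUI-64 scheme (RFC 4291): flip the universal/local bit
    of the first octet and insert ff:fe between the two halves.
    (Output is zero-padded rather than canonically compressed.)"""
    octets = [int(part, 16) for part in mac.replace("-", ":").split(":")]
    assert len(octets) == 6, "expected a 48-bit MAC address"
    octets[0] ^= 0x02                                  # universal/local bit
    eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]     # ff:fe in the middle
    groups = ["%02x%02x" % (eui64[i], eui64[i + 1]) for i in range(0, 8, 2)]
    return prefix + ":".join(groups)

print(mac_to_eui64("00:1a:2b:3c:4d:5e"))
```

It is exactly this traceability, incidentally, that led to the “privacy extensions” of RFC 4941, which randomise the identifier – another reason the scheme makes a poor surveillance register.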
And this is where it all falls apart because no one, or hardly anyone, uses IPv6, no one, or hardly anyone, knows how to set IPv6 up and no one, absolutely no one, has shown any willingness to pay the costs of converting today’s IPv4 networks into IPv6 networks.
IPv4 has run out of addresses, so there have been excellent technical reasons to convert to IPv6 for years – but the fact that it has not happened tells its own story. If the government mandated that all new network connections had to be IPv6, most ISPs would likely go out of business, and goodness knows what would happen to the mobile phone networks (where providers currently treat their networks as walled IPv4 gardens, using their private nature to keep bad things out and to contain contagions inside their own networks – a unique, internet-visible IPv6 address for every device would likely shatter that).
And, in any case, I am not convinced it would work. All Docstudent would have to do is route Clu’s emails via a foreign IPv4 server and the IPv6 address would be shaved off – unless, that is, the government proposes to cut the UK off from the rest of the world!
Anyway, that is my reading of what is going on here and why it is likely to fail, leaving Home Secretary Theresa May with huge quantities of egg on her face and wasting a lot of public money if this is ever tried seriously. If someone knows a way they could do this without these problems, step right up and set me to rights!
- BT Retail Tests IP Address Sharing (techweekeurope.co.uk)
- Centre unveils IPv6 roadmap (thehindu.com)
- Basic IPv6 concepts (javacodegeeks.com)
- Benefits of IPv6 – its Time to Adapt (cjnetworks.wordpress.com)
- Customers fume as BT introduces IP sharing (pcpro.co.uk)
- BT Begins Customer Tests of Carrier Grade NAT (Slashdot) (tech.slashdot.org)
- What’s the Best IPv6 Transition Option for You? (circleid.com)
- INET Denver considers Internet life without IPv4 addresses (arstechnica.com)
- A Primer on IPv4, IPv6 and Transition (circleid.com)
- A MAP to Easier, More Scalable IPv6 Deployments (blogs.cisco.com)
If you want to give yourself a good chilling scare then start reading articles on H7N9 – the mysterious and fairly deadly new strain of influenza that appears to be very widely spread amongst poultry in Eastern China.
No one seems very sure about how it is spreading (thankfully this appears to be just from bird to bird at the moment) or even which particular birds are the main vectors: there is some suggestion it could be city pigeons who are spreading it to poultry in markets, rather than the other way round.
But what is worse, far worse, is that if it did mutate and spread via human to human transmission we might be defenceless against it for a long time.
Now, if (or more likely, when) that happens the mutation may mean that it loses its current deadly force (as I understand it, at the moment scientists can only be sure that 20% of patients will recover – 20% have already died and the rest are ill – hopefully eventually to recover) but there is no guarantee of that. The 1918 flu did not lose its virulence on crossing into mammals and while H1N1 (swine flu) was not as bad as was once feared that was not because it was weakened on transmission.
Yet H1N1 has created a climate where politicians fear being accused of falling for scare stories and where the sort of viciously anti-science press we have in Britain would be the first to go on the attack if public money was spent to pump-prime anti-epidemic preparations. After all the Daily Mail still will not even acknowledge its despicable role in the anti-MMR scare which has caused a measles epidemic in Britain.
None of this is good.
- Scientist: Poultry trade may be spreading deadly bird flu (edition.cnn.com)
- Top US virologist: Prepare now for deadly bird flu mutation (rawstory.com)
- First case of new bird flu strain found outside eastern China – Reuters (reuters.com)
As I remarked before, the main problem with today’s chemistry sets is that they are inane to the point of boredom. I bought my eldest daughter one a few years ago and we both gave up on it because it was so tedious.
But that does not justify Tesco’s claim that they are only for boys!
This storm has just begun…
- The Good Old Days (eschatonblog.com)
- Chemistry Set Boasts “No Chemicals” (makezine.com)
- How to repel kids from science: By chaining curiosity in cuffs (blogs.scientificamerican.com)
- The Chemistry gift guide – Celebrating chemistry and inspiring the next generation of chemists! (makezine.com)
To draw the trees I used the venerable Reingold-Tilford algorithm, which is more or less the standard approach. I have written some blog posts about it, and pages here seem to come pretty high up in Google searches for the algorithm, so I regularly get passing traffic as a result.
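For anyone arriving from one of those searches, the core idea can be sketched briefly. This is my own simplified toy version for binary trees, not the full algorithm – real implementations use contour “threads” to run in linear time, while this one rebuilds contour lists and is quadratic:

```python
class Node:
    def __init__(self, left=None, right=None):
        self.left, self.right = left, right
        self.x = 0.0  # horizontal offset relative to parent
        self.y = 0    # depth

def layout(node, depth=0):
    """Simplified Reingold-Tilford: lay out each subtree independently,
    push the right subtree just far enough right that the facing
    contours stay one unit apart, and centre the parent over its
    children. Returns the subtree's (left, right) contours as
    per-depth x-extents relative to this node."""
    node.y = depth
    node.x = 0.0
    if node.left is None and node.right is None:
        return [0.0], [0.0]
    if node.left is None or node.right is None:
        only = node.left or node.right
        lc, rc = layout(only, depth + 1)
        only.x = 0.0                      # single child sits under parent
        return [0.0] + lc, [0.0] + rc
    llc, lrc = layout(node.left, depth + 1)   # left subtree's contours
    rlc, rrc = layout(node.right, depth + 1)  # right subtree's contours
    # Smallest shift keeping the facing contours >= 1 apart at every
    # depth the two subtrees share.
    sep = max(lrc[i] - rlc[i] + 1.0 for i in range(min(len(lrc), len(rlc))))
    node.left.x, node.right.x = -sep / 2, sep / 2
    def merge(a, ax, b, bx):
        # Follow one subtree's contour as deep as it goes, then the other's.
        return [0.0] + [a[i] + ax if i < len(a) else b[i] + bx
                        for i in range(max(len(a), len(b)))]
    return (merge(llc, node.left.x, rlc, node.right.x),
            merge(rrc, node.right.x, lrc, node.left.x))

def positions(node, x=0.0, out=None):
    """Flatten the relative offsets into absolute (x, depth) pairs."""
    out = [] if out is None else out
    x += node.x
    out.append((x, node.y))
    for child in (node.left, node.right):
        if child:
            positions(child, x, out)
    return out

# A complete binary tree of depth two: siblings come out one unit
# apart, with each parent centred over its children.
tree = Node(Node(Node(), Node()), Node(Node(), Node()))
layout(tree)
print(sorted(positions(tree)))
```

The separation step is the heart of the algorithm: it is what gives Reingold-Tilford drawings their characteristic tidy, symmetric look – the very aesthetic the experiment below puts under scrutiny.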
But idly chasing these links led me to the forthcoming Handbook of Graph Drawing and Visualization, edited by Roberto Tamassia, which contains a chapter on tree drawing by Adrian Rusu – and bad news for us Reingold-Tilford fanboys, as this summary from the book of an experiment comparing algorithmic performance shows (emphasis added):
• The performance of a drawing algorithm on a tree-type is not a good predictor of the performance of the same algorithm on other tree-types: some of the algorithms perform best on a tree-type, and worst on other tree-types.
• Reingold-Tilford algorithm [RT81] scores worse in comparison to the other chosen algorithms for almost all ten aesthetics considered.
• The intuition that low average edge length and area go together is contradicted in only one case.
• The intuitions that average edge length and maximum edge length, uniform edge length and total edge length, and short maximum edge length and close farthest leaf go together are contradicted for unbalanced binary trees.
• With regards to area, of the four algorithms studied, three perform best on different types of trees.
• With regards to aspect ratio, of the four algorithms studied, three perform well on trees of different types and sizes.
• Not all algorithms studied perform best on complete binary trees even though they have one of the simplest tree structures.
• The level-based algorithm of Reingold-Tilford [RT81] produces much worse aspect ratios than algorithms designed using other approaches.
• The path-based algorithm of Chan et al. [CGKT02] tends to construct drawings with better area at the expense of worse aspect ratio.
- Speed Of An Algorithm (aishwaryr.wordpress.com)
Well, I think it’s improved anyway