Anyone got any thoughts on the LaTeX companion?

Should I shell out £23 for The LaTeX Companion to ensure I can most effectively write my documents and design my slides for university? I have The LaTeX Graphics Companion and there is no doubt it is a good book, but the number of books I could buy increases exponentially the more I think about the work I need to do. So, does anyone have practical experience of the book’s usefulness to a computer science research student with a middling level of LaTeX experience, who is likely to use LyX for a lot of his work?

Betamax versus VHS in your browser

Everyone knows the story, even if, unlike me, they are not old enough to remember it: the video format VHS overcame the superior Betamax format to dominate the home video market in the 80s and 90s.

Of course, the claims that Beta was superior are rather tendentious, but the fact that video producers stuck with a Beta format for their own tapes long after the rest of us had switched to a VHS monoculture must say something.

Now, inside your browser, the same thing has happened. As is noted in today’s Guardian, the 1987-vintage GIF format refuses to die. These days you see far fewer static GIFs than even a few years ago (though they are still out there in large numbers) – JPEG and PNG dominate. But you’ll have to look very hard for an MNG (the ‘official’ PNG analogue of the animated GIF) or even APNGs, the unofficial but more widely used attempt to animate PNGs.

A few years ago I wrote a Perl PNG module – Image::Pngslimmer – which replicated many of the functions of the C libpng library, so I could use some of that in CGI code without having to switch from Perl to C and back again. Then – this was 2006 or so – PNG support was quite weak in browsers and GIFs were far more plentiful.
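Whatever the language, any PNG tool has to deal with the same underlying structure: an 8-byte signature followed by typed chunks of (length, type, data, CRC). The sketch below illustrates that structure in Python rather than Perl; it is not based on the Image::Pngslimmer code itself, just on the published PNG format.

```python
# A minimal sketch of PNG chunk parsing (illustrative only, not the
# Image::Pngslimmer code): every PNG is an 8-byte signature followed
# by chunks of 4-byte length, 4-byte type, data, and a CRC-32.
import struct
import zlib

PNG_SIGNATURE = b"\x89PNG\r\n\x1a\n"

def make_chunk(ctype: bytes, data: bytes) -> bytes:
    """Assemble one PNG chunk: length, type, data, CRC-32 over type+data."""
    return (struct.pack(">I", len(data)) + ctype + data
            + struct.pack(">I", zlib.crc32(ctype + data)))

def list_chunks(png: bytes) -> list:
    """Return the chunk type names found in a PNG byte string."""
    assert png.startswith(PNG_SIGNATURE), "not a PNG file"
    pos, names = len(PNG_SIGNATURE), []
    while pos < len(png):
        (length,) = struct.unpack(">I", png[pos:pos + 4])
        names.append(png[pos + 4:pos + 8].decode("ascii"))
        pos += 12 + length  # 4 (length) + 4 (type) + data + 4 (CRC)
    return names

# Build a legal 1x1 greyscale PNG entirely in memory.
ihdr = struct.pack(">IIBBBBB", 1, 1, 8, 0, 0, 0, 0)  # 1x1, 8-bit grey
idat = zlib.compress(b"\x00\x00")  # filter byte + one pixel
png = (PNG_SIGNATURE + make_chunk(b"IHDR", ihdr)
       + make_chunk(b"IDAT", idat) + make_chunk(b"IEND", b""))

print(list_chunks(png))  # → ['IHDR', 'IDAT', 'IEND']
```

Slimming a PNG, as the module's name suggests, largely amounts to walking these chunks and dropping or recompressing the ones a browser does not need.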

PNG is a superior format to GIF (especially for line drawings and similar – for general photographs JPEG is the superior choice, unless you truly demand a lossless format) and it is a good thing that it has edged GIF out in many places. But it seems we will be stuck with GIFs for many years yet.

What really drives Moore’s Law

Gordon Moore on a fishing trip (Photo credit: Wikipedia)

“Moore’s Law” is one of the more widely understood concepts in computer hardware. Many ordinary people, including those with little or no understanding of what goes into making an integrated circuit, grasp the idea that computer hardware becomes better (and cheaper) in some sort of geometric way. But, actually, the rate of and reasons for this “law” have been in flux ever since it was first proposed in 1965.
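As a toy illustration of that geometric growth, here is a naive projection assuming the popular "doubling every two years" formulation, with the roughly 2,300 transistors of the 1971 Intel 4004 as a baseline. The figures are illustrative of the shape of the curve only, not drawn from any particular source discussed here.

```python
# Naive "Moore's Law" projection: transistor counts doubling every
# two years from a 1971 baseline (the Intel 4004's ~2,300 transistors).
# Illustrative only - the real history, as discussed below, is messier.
def projected_transistors(year, base_year=1971, base_count=2300,
                          doubling_years=2):
    """Project a transistor count by doubling every doubling_years."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Five decades of doubling takes the toy model from thousands of transistors to tens of billions, which is the "geometric" intuition most people carry around, even though the mechanism behind it has changed repeatedly.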

As part of the very first baby steps on my road to getting a PhD I have to prepare a literature review, and that means demonstrating an understanding of the processes by which, while Gordon E. Moore‘s predictions about increased transistor density have broadly held up, the era of ever-faster single-processor (or even few-processor) computers is decisively over.

So I have read this paper – Establishing Moore’s Law – which gives some interesting perspectives on what we now refer to as “Moore’s Law”.

The term “Moore’s Law”: Although coined by Carver Mead of Caltech, the term was popularised by Robert Noyce, Moore’s co-founder at Intel, in an article in Scientific American in 1977.

The original formulation was a prediction based on the need of IC makers to compete: In 1965, when Moore first formulated what we now call his “law”, it was actually a prediction based on the need of manufacturers of integrated circuits – then a relatively new and experimental technology – to compete with the manufacturers of single electronic components. His prediction was that manufacturers would need to radically cut the costs and increase the complexity of their products if they were to successfully compete.

The ‘law’ broke down in the late 1960s: From around 1965 to 1969 the ‘law’ didn’t work, in the sense that the growth in chip speed and complexity did not match the prediction. In Moore’s view this was because manufacturers did not produce chips “whose complexity came close to the potential limit”.

The ‘law’ is not based on any fundamental character of ICs but on a wide interplay of technology and commercial factors: This is the overall theme of the paper – the early gains were based on the need for ICs to compete at all, then came gains founded on the introduction of computer aided design and then from a state sponsored drive by Japanese companies to gain a foothold in the market (in fact they came to dominate the market in memory technology). Then came the Microsoft-Intel alliance and on to the present…

The coming HTML5 disaster

HTML5 official logo (official since 1 April 2011) (Photo credit: Wikipedia)

About 18 months ago I got my first Android phone. One of the first applications I downloaded on it was for Facebook. It had some quirks but it worked fine.

Not long after I was prompted to ‘upgrade’ to the next version, which I duly did.

The supposed upgrade was (and is) a disaster. Slow, difficult to understand, a mess.

I had always wondered why Facebook had not simply rolled back the upgrade and tried again. But now I know. To cut their costs they had based their iOS and Android applications on a common HTML5 core. A common code base eliminated the need to maintain two separate blocks of complex code, presumably with two sets of developers.

But it didn’t work. By all accounts the iOS version made the Android one look slick, and this week it was axed in favour of an Objective-C based application. Hopefully a Java-based Android replacement is also in the works.

But I suspect sloth will be the least of HTML5’s problems. Turning markup into executable code just sounds like a recipe for trouble, and it has only just started.

Stopping the patent madness

The BBC’s lead story in Britain today is about a decision of a US court which has no direct or immediate applicability in Britain at all – Apple’s victory over Samsung in a US patent case.

The fact that the case, at least in theory, matters not a jot to Britain seems to have rather passed the BBC by. It is not even the case that Samsung’s behaviour is likely to affect how European regulators might see the company, as the plain fact is that software patents have no legal standing in European law, and in Britain it is unlawful to grant a patent based on a mathematical procedure, which is what an algorithm (a software procedure) is.

United States Patent Cover from a real patent issued (Photo credit: Wikipedia)

The basis of the British law seems sound to me. No one makes an invention when they discover a new mathematical procedure – the maths is universal and has always existed (I know that there are philosophical arguments to the contrary, but I’m not buying them!).

But thanks to the BBC Apple have now won half the battle in the UK – many consumers will now think a Samsung product tainted or perhaps even illegal, thanks to some pretty poor journalism.

Yet there is an even deeper threat. Politicians, of all parties, in the UK are subject to some heavy-duty lobbying and pressure from “rights holders” in all fields to extend and deepen the patent and copyright regime. The problem is not one of corruption – politicians, in my experience at least, are not being paid or even offered money to advocate this position.

Instead, with the economy weak, they are being told that extending patent and copyright is the best way to protect jobs and build exports. It’s garbage, frankly: the best way to protect jobs and extend exports is to innovate and create, not to rely on rent from past creations – but it is being listened to.

Already this year European governments and legislators sanctioned stealing property from consumers by retrospectively increasing the copyright protection period on performances. Items which had entered the public domain were simply stolen back from the public – hardly as outrageous as the Enclosure Acts, but exactly the same principle, with the state acting unilaterally to enhance the financial health of the largely already wealthy.

From my time working in government and for the Labour Party in government I know this was a long term aim of the music industry and that Labour ministers who ought to have known better were seduced by the argument that as Britain had been such a pioneer in the global popular music industry it was in the ‘national interest’ to legislate in this way. The alternative argument – that it was in the national interest to encourage innovative ways to use works created half a century ago – was dismissed out of hand: having had their hands badly burnt when the dot com bubble burst ministers were not keen to listen to another bunch of ‘new economy’ arguments. The current government seems similarly bewitched.

But there is a deeper argument here too. Copyright and similar protections (such as patents) are privileges granted by the state to encourage innovation for the benefit of the public. There is no ‘natural’ basis for copyright – if I perform something in public I should expect people to copy it. But because such copying might discourage me from subsequent public performances, I am granted a special, time-limited legal protection. The law exists not to benefit me, but to encourage me to act in a way that benefits the public. Extending copyright protection from 50 to 75 years has no such public benefit – it merely benefits the copyright holder, and it is simply not credible to suggest that works will be created today because protection has been extended that would not have been created when the previous protection lasted only 50 years.

So too with patents. These should exist to encourage genuine invention (i.e., not to privilege the discoverers of that which already exists) in a way that benefits the public. If patent law allows one company to establish a monopoly on a vital technology then it is time to rethink it all.

Sticking with

Thanks to Slashdot and a couple of other places, this last week has seen the blog visited by close to 60,000 browsers – an impressive figure in anyone’s book. But almost all (about 50,000) of that was compressed into last weekend and at the peak there were perhaps 150 new visitors each minute.

Now, I know from personal experience that, with good design and testing, a dedicated web server on a machine with enough memory could probably handle that – but I also know it is difficult to guarantee. The hosted service handled it all with ease, so although I am not likely to have such a busy period again any time soon, I will be sticking with the free service.

Ketosis, diets and human spontaneous combustion

My report of Brian J. Ford’s work on this seems to have generated quite a bit of traffic – including some people on ‘keto’ diets ridiculing what they seem to think is a suggestion they are in danger of bursting into flames.

Ford’s original article in The Microscope is now available for public reading (warning, it contains some grisly pictures). So I’d suggest doing that. But I hope the journal won’t mind if I quote his final two paragraphs:

For the very first time we have a plausible model for spontaneous human combustion, which offers a natural metabolic explanation for this well-documented phenomenon. The model experiments match the descriptions of cases involving people. What are the practical implications? First — and this is important — do not accuse youngsters of solvent abuse, just because they have acetone on their breath. They are more likely to be unwell, or on a fat-free diet. Second, if you are suffering ketosis, it might be wise to avoid wearing synthetic fibers with the likelihood of static sparks. Is there a reasonable chance of dying from SHC? That can’t be right — the reported episodes have been extremely rare so it is far too unusual to cause anyone serious concern.

On the other hand, if you’re susceptible to ketosis, now might be the perfect time to give up smoking.