Problems with oprofile


As one last thing for my report I wanted to profile the kernel in a specific configuration, so I thought I would try oprofile rather than the cruder profile=X command line option.

Big mistake.

Essentially I could not get it to run under KVM at all. KVM hides many hardware details from the profiler and setup is notoriously difficult (I now know). Apparently this is fixable, but not simply, and there is very little information out there about how to do it. If I had days to spare to learn, maybe I would persevere, but I don’t.

But I did come across one feature/bug in oprofile that I will document a fix for in the hope it proves useful to someone.

To start oprofile off (to profile the kernel), one has to specify where a vmlinux file (note, not a compressed vmlinuz or bzImage etc) or similar is.

Mine were of the format vmlinux-3.0.0-sched+ but oprofile consistently failed to let me specify that: again, I did not have time to go into the details, but it was clear that the + was the issue. I renamed the file and all was fine.
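For anyone hitting the same thing, here is a minimal sketch of the workaround – the paths and the opcontrol invocation are illustrative, so adjust them to your own build:

```shell
# The '+' suffix is appended by scripts/setlocalversion when the tree
# has uncommitted changes; oprofile would not accept it in the path,
# so work from a copy with a sanitised name.
orig=vmlinux-3.0.0-sched+
clean=${orig%+}                    # strip the trailing '+'
cp "/boot/$orig" "/boot/$clean"

# Then point oprofile at the renamed image:
opcontrol --vmlinux="/boot/$clean"
opcontrol --start
```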


Best book on Linux kernel internals

Writing an MSc project report means having to read a lot of source code and constantly referring to texts in the hope that they will make things clearer.

I have three books on the kernel – there are obviously others, but I think two of these three will be familiar to most kernel hackers – but it is the third that I rate most highly.

Understanding the Linux Kernel: for many this must feel like the standard text on the kernel – it’s published by O’Reilly, so (my experience with their XML Pocket Reference notwithstanding) will be good, it’s well printed and readable and it is, after all, the third edition of a book your forefathers used. But the problem is, it is also now six years old and a lot has happened since then. I well remember going into Foyles in the autumn of 2005 and seeing it newly minted on the shelves. For a long time it could hide behind the fact that the kernel was still in the 2.6 series, but even that protection is gone. Verdict: Venerable but getting close to past it.




Linux Kernel Development: the previous, second, edition of this book was a fantastic introduction to how the kernel worked and was written in a slightly whimsical tone which made it easier to read. It is rare that one can read a computer book like a novel, starting from the first page and going on to the end, but you could with that. Sadly someone seems to have got to Robert Love (who, from personal experience, I know to be a great guy) and presumably told him he had to be more serious if he wanted his book to be a set text for CS courses. The problem is that the book now falls between two stools – a solid but broad introduction to how the kernel works on the one hand, a guide to writing kernel code on the other – and it does not quite hit either target. That said, it is still worth having. Verdict: Good, but where did the magic go?






Professional Linux Kernel Architecture: unfortunately, everything about the Wrox brand suggests “cheap and nasty”, which is a real pity, as this book is the best of the bunch. Admittedly it would not, unlike Robert Love’s book, be at all suitable as a primer for operating system study – it is far too big and detailed for that. But if you are looking for an up-to-date (or reasonably so, anyway) helpmate for actually patching the kernel, then this seems to be the best choice. Sadly the cheap and nasty side does creep through on occasion in bad editing/translation, but it’s not enough to stop me from recommending it. Verdict: this one’s a keeper, for now anyway.

My first R program

Having used Groovy (which makes the scripting environment feel familiar) and some Scheme (via Structure and Interpretation of Computer Programs), R does not feel completely alien, but it is still a steep learning curve.

But here’s my short script –

# Read the compile-time data; realm holds minutes, reals the leftover seconds
unpatched <- read.csv("~/unpatched.txt")
# Combine the two columns into a single elapsed time in seconds
unpatchcons <- transform(unpatched, realm = realm * 60 + reals)
# Fit a straight line of elapsed time against kernel size
linelog <- lm(realm ~ size, data = unpatchcons)
plot(unpatchcons$size, unpatchcons$realm, log = "y")
# untf = TRUE draws the untransformed straight line on the log-scaled axes
abline(reg = linelog, untf = TRUE, col = "blue", lty = 3)

And here’s the graph (of Linux kernel compile times) it generates – the blue line is obviously a very bad fit!

Linux kernel compile times

Testing OpenGrok


Hacking at the kernel means using a Linux Cross Reference (LXR) is pretty much essential.

I have set one up on my own servers before, but it was difficult to maintain and the performance was poor.

But I am trying out the OpenGrok tool now – this was quite easy to install once I realised that the thing to do was not to read the various online descriptions of what to do, but to look at the README file that came with the binaries.

First impressions … it looks nice but I am not sure it is really up to it.

You can try mine at

Linux 3.0


Ten years ago next week I booted my first Linux machine – Labour had just won its second landslide and I had a few days off work after the election and so went out and bought a PC from the late lamented Morgan Computers and – dear reader, pity my naivety – paid about £50 or so for a boxed copy of Red Hat Linux 7.0 (still cheaper than the alternative).

I didn’t really know anything about Linux beyond a few basic shell commands and it was a steep learning curve. But I have never looked back.

Today I have just updated my git repository as I work on setting up my project and have been shocked to see that Linus Torvalds has baptised the latest version of the kernel 3.0-rc1 – my version of Red Hat was the first they had released as a 2.4 series kernel.

Onwards and upwards.

What’s next?


My first exam in the second year of the (part-time) MSc is tomorrow and I guess I am writing this blog partly as a way of avoiding more revision, but partly also because if last year’s experience is any guide, that exam will knock the stuffing out of any optimism I have, so I shall write something now while I still have some hope.

The exams are not the end of the degree: if I pass them then technically I can claim a post-graduate diploma, but I already have one of those, in Journalism Studies from Westminster, and as was said to me at the time, “it’s just about worth the paper it is printed on”: I learnt a lot but nobody else was much impressed.

To get the degree I need to complete my project on memory management in the Linux kernel – it’s an ambitious project and time will be short so it may get frantic.

But when it’s over, what will I do? I don’t plan to work in IT: 45 seems quite an age to go from reasonable success and some prospects in one career to starting at the bottom of another.

But nor do I want to abandon science for a second time. A part-time PhD? That really is a long term commitment, though.

Flattered by spam


I hate spam. Like everyone else on the internet.

But maybe not all spam. I now get occasional emails from two technical recruiters in the United States asking me if I want a job as a Linux kernel engineer. As I have never signed up with Texan technical recruiters I assume they got my email address by grepping either the kernel source or LKML.

Either way I admit to being slightly flattered by it, even though they have almost certainly sent out thousands of these things.

“Analyzing Computer System Performance with Perl::PDQ”


My MSc project is ambitious – sometimes, it seems, too ambitious: to test various ways in which the Linux VMM and scheduler could be modified to improve performance under high levels of stress.

Books are an essential guide in computer science, perhaps the essential guide, and having one that helps me get to grips with issues such as measuring the Linux kernel’s use of the VM subsystem, designing tests of application behaviour, page faulting rates and so on is increasingly important.

So, is Analyzing Computer System Performance with Perl::PDQ any good? (I found it by trawling around Amazon, but please send me links to anything else, from scientific papers upwards.) Has anybody got alternative suggestions?

Getting peace of mind – at a price


I confess that one of my biggest fears has been that any work I do on my MSc project will be lost in some sort of catastrophic computer disk crash.

Git – and the various free services available, such as github – offered a good way of backing the work up. But given that the project is based on the Linux kernel, which is bigger than the official maximum size of a github project, it looked less than hopeful.

(That was not helped by their support staff – one of whom appeared to tell me I should delete the binaries in the kernel and I should be ok.)

So I have gone for unfuddle. It’s not free – $24 a month for a 2 GB repo – but it should do the job with no problem.
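Pushing the existing tree there is straightforward; a sketch, with a hypothetical repository URL in place of my real one:

```shell
# Add unfuddle as a second remote alongside origin and mirror the
# whole history there as an off-site backup.
cd ~/linux
git remote add unfuddle git@myaccount.unfuddle.com:myaccount/kernel.git
git push unfuddle master           # the first push uploads the full history
```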

Slow progress on the project proposal


I have been making slow progress on my project proposal – sometimes it has felt like a mirage: the further I go, the further away the real target seems to be.

But I am getting there – though I seem to have written five pages of dense type explaining how Linux paging developed and works without actually describing any problems or what I intend to do about them.

Well, the plan now – and I am writing this down as an aide-memoire/encouragement to actually do it – is to move on from where I am now, describing the 2Q-like LRU lists in the Linux kernel, to some of the problems, then to the alternative “working set” approach (e.g. as used in Windows NT and, before that, VMS) and then to some of the strategies and tactics that could be used to apply it in Linux.