Learning more about neural networks

Cheap and accessible books on neural nets are not easy to find, so I bought a second-hand copy of “Practical Neural Network Recipes in C++” on Amazon (link). According to Google Scholar this book, though now 24 years old, has over 2,000 citations, so it ought to be good, right?

Well, the C++ is really just C with a few simple classes: lots of pointers and not an STL class to be seen (but then I can hardly blame author Timothy Masters for not seeing into the future). The formatting is awful; it seems nobody even thought to set the source code in a different typeface from the body text. But, yes, it works.

Essentially I used it to build a simple neural network, which worked, after a fashion. I did have to get additional help to understand back propagation, though, because the book’s explanation is garbled: it mixes up, for instance, the summed input to a neuron and the output of its activation function.
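To spell out the distinction that tripped me up, here is a minimal sketch of a single sigmoid neuron (my own notation and naming, not code from the book): the summed input z and the activation output a are kept as separate quantities, and back propagation needs both.

```cpp
#include <cmath>
#include <vector>

// A single sigmoid neuron, keeping the summed input z separate from the
// activation output a. (My own sketch, not code from the book.)
double sigmoid(double z) { return 1.0 / (1.0 + std::exp(-z)); }

// Forward pass: z = sum_i(w[i] * x[i]) + b, then a = sigmoid(z).
double forward(const std::vector<double>& w, const std::vector<double>& x,
               double b, double& z)
{
    z = b;
    for (std::size_t i = 0; i < w.size(); ++i)
        z += w[i] * x[i];
    return sigmoid(z);
}

// Backward pass: the error term delta = dE/dz is dE/da times the derivative
// of the activation at z (for the sigmoid that is a * (1 - a)); the gradient
// for each weight is then delta times the *input* x[i].
void backward(const std::vector<double>& x, double a, double dE_da,
              std::vector<double>& dE_dw, double& dE_db)
{
    const double delta = dE_da * a * (1.0 - a);
    dE_db = delta;
    dE_dw.assign(x.size(), 0.0);
    for (std::size_t i = 0; i < x.size(); ++i)
        dE_dw[i] = delta * x[i];
}
```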

(In fact I didn’t build a fully connected network, because the book didn’t say, anywhere that I could see, that you should. I have rectified that now; my network is much slower at learning, but it does seem, generally, to be delivering better results.)
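For what it’s worth, this is roughly what “fully connected” means in code (again a sketch under my own assumptions, not anything lifted from the book): every neuron in a layer receives input from every output of the previous layer, so the weights form an n_out by n_in matrix.

```cpp
#include <vector>

// Forward pass through one fully connected layer: W is n_out x n_in, so every
// neuron j sees every output of the previous layer. (A sketch of the idea
// only; names and layout are my own.)
std::vector<double> fully_connected(const std::vector<std::vector<double>>& W,
                                    const std::vector<double>& bias,
                                    const std::vector<double>& prev)
{
    std::vector<double> z(W.size());
    for (std::size_t j = 0; j < W.size(); ++j) {
        z[j] = bias[j];
        for (std::size_t i = 0; i < prev.size(); ++i)
            z[j] += W[j][i] * prev[i];   // weight from previous neuron i to neuron j
    }
    return z;   // the activation function is then applied element by element
}
```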

But it seems that 24 years is a long time in the world of neural nets. I now know that “deep learning” isn’t just a faddish way of referring to neural networks, but a reflection of the idea that deep nets (i.e., those with multiple hidden layers) are generally thought to be the best option, certainly for image classification tasks. Timothy Masters’s book essentially describes additional layers as a waste of computing resources: certainly anything above two hidden layers is expressly dismissed.

Luckily I have access to an electronic library, so I haven’t had to buy a book like “Guide to Convolutional Neural Networks” (Amazon link), which I have nonetheless found invaluable in learning what I need to do. But it’s complicated: if I build many different convolutional layers into my code the network will be slow(er), and it will be time to break out the threads and go parallel. But now that I have fallen into this rabbit hole, I might as well go further.
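When that time comes, the obvious first step is to split the convolution work across threads. The sketch below is only an illustration of the idea, not my actual code: it divides the output rows of a single-channel 2D “valid” convolution between std::thread workers.

```cpp
#include <algorithm>
#include <thread>
#include <vector>

// Each worker computes a band of output rows of a 2D "valid" convolution
// (square kernel, no padding). out must already be sized to
// (in.size() - kernel.size() + 1) rows of the matching width.
void convolve_rows(const std::vector<std::vector<double>>& in,
                   const std::vector<std::vector<double>>& kernel,
                   std::vector<std::vector<double>>& out,
                   std::size_t row_begin, std::size_t row_end)
{
    const std::size_t k = kernel.size();
    for (std::size_t r = row_begin; r < row_end; ++r)
        for (std::size_t c = 0; c + k <= in[r].size(); ++c) {
            double sum = 0.0;
            for (std::size_t i = 0; i < k; ++i)
                for (std::size_t j = 0; j < k; ++j)
                    sum += kernel[i][j] * in[r + i][c + j];
            out[r][c] = sum;
        }
}

// Split the output rows into roughly equal bands, one per thread.
void convolve_parallel(const std::vector<std::vector<double>>& in,
                       const std::vector<std::vector<double>>& kernel,
                       std::vector<std::vector<double>>& out,
                       unsigned n_threads)
{
    const std::size_t out_rows = in.size() - kernel.size() + 1;
    const std::size_t band = (out_rows + n_threads - 1) / n_threads;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n_threads; ++t) {
        const std::size_t begin = t * band;
        const std::size_t end = std::min(out_rows, begin + band);
        if (begin >= end) break;
        workers.emplace_back(convolve_rows, std::cref(in), std::cref(kernel),
                             std::ref(out), begin, end);
    }
    for (auto& w : workers) w.join();
}
```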
