Nine Algorithms That Changed the Future
Author: John MacCormick

Nine algorithms that changed the future - but which nine? The subtitle is "The ingenious ideas that drive today's computers" and the book is very much aimed at the average reader rather than the expert. If you are an expert, or even know just a little about programming and algorithms, then much of the fun is in discovering which algorithms have been picked and how they are described.

The very first thing to say is that this book isn't about the most interesting or amazing algorithms. If it were, the subject matter would be much more difficult to explain - not impossible, but difficult. For example, you won't find Quicksort or any of the divide-and-conquer algorithms, and you won't find anything theoretical such as complexity theory. This is a selection of algorithms that have a claim to having changed things. From this perspective it is surprising that nine neat, elegant algorithms can be pinned down - and, as will become apparent, perhaps they can't.

After a general introduction, Chapters 2 and 3 deal with the search engine algorithms: first the idea of indexing and then the PageRank algorithm. If you know even a little about programming, the idea that indexing is an algorithm worth discussing is perhaps more a reflection of the importance of the search engine. PageRank is more worthy of attention. From here we move on to two chapters on coding theory: Chapter 4 introduces public key cryptography and Chapter 5 does the same for error correcting codes.
Chapter 6 isn't really about an algorithm at all. Pattern recognition is a diverse subject with many algorithms, and the idea that it somehow counts as one of the nine algorithms that changed the future is very strange. The chapter itself is best described as all over the place - decision trees, neural networks, nearest-neighbour algorithms and so on. Anyone who knows about pattern recognition will be disappointed that their own favourite algorithm isn't included.

Chapter 7 is about data compression, but once again it isn't a single algorithm that is explained. Both lossless and lossy compression are covered together, even though they are based on very different theories. Next we move on to databases - relational theory is interesting and the description attempts to explain why it is an important approach. Again, this isn't exactly an algorithm, more a complete approach to a problem that is replete with multiple algorithms. Chapter 9 deals with digital signatures and returns to public key cryptography; it could have been included in Chapter 4 as a second section. Finally we have a roundup of what is computable to round things off.

The writing style is best described as steady and well paced. The one irritation, and for some readers it could be a big one, is the repeated use of the word "trick". Every algorithmic invention, or even anything that isn't 100% obvious, is called a "trick". This is slightly tedious and perhaps even undervalues the cleverness of some of the techniques. There is also the irritation, for some, that it would be difficult to make a list of nine clear-cut algorithms that the book is supposed to be focused on. In many cases what we have is a subject area and a collection of its most common algorithms.
This said, as long as you don't mind the repetition of "trick" and aren't expecting a selection of the best algorithms in the toolkit, this is a book that will communicate what computers do to the average reader, and for that you have to give it credit. Recommended to a wide audience, but not if you know enough about programming to have opinions about your own favourite algorithms.
Last Updated: Saturday, 16 December 2017