How We Test Software at Microsoft

Authors: Alan Page, Ken Johnston & Bj Rollison
Publisher: Microsoft Press, 2009
Pages: 405
ISBN: 978-0735624252
Aimed at: Developers and software architects, managers and general audience of software users
Rating: 4
Pros: Lots of amusing anecdotes, some clear theory
Cons: Lacks specific detail of testing techniques
Reviewed by: Mike James

A book on testing software at Microsoft! It's a gift to anyone wanting to make a joke at the company's expense. Many a user would claim that, based on its products, there is no evidence that testing goes on at all, and that surely such a book would have very few pages. Of course this is unfair, but fun. Getting software right isn't easy, and given the size of the Microsoft codebase you can expect one or two bugs to have slipped through. So a book on how the software giant tackles the problem is a good idea and should be instructive.

The first thing to say is that this is not the book to read if you want to know much about the theory and practice of testing. It does describe some of the basic ideas of testing, and it has some nice anecdotes that will raise a smile, but it really doesn't describe anything that isn't better described in other books. It isn't a course on how to test software, that much is clear, but what exactly the book is about is more difficult to explain. It starts off with some reminiscences of Microsoft, tales of company days out and bonding. It describes how the idea of testing software seems to have occurred to Microsoft at what to the rest of us might seem like a very late point in the development of a software company. It covers how Bill Gates hired people who went to the same school as he did to do jobs he really didn't seem to understand. It explains how Microsoft realised that testing was a good idea and how it recruited people, created job structures and managed the whole thing. This first section of the book is best described as a guide to the testing management structures within Microsoft. The enthusiasm for the company expressed is probably appropriate for a Microsoft employee writing a Microsoft Press title, but for the reader it very quickly becomes slightly embarrassing and gauche. It reads very much like a final-year school student writing up a report of the year's activities, certain that it will be read by the principal. Overall the effect is to make you think that Microsoft started out as an amateur organisation and stumbled its way to some sort of serious professional structure by sheer enthusiasm and energy. A notion I personally can believe.

The second major section is about testing theory. For the general reader interested in testing rather than in Microsoft this is worth the effort as it is well written and clear, but it doesn't go very far and you can find the same material in almost any book on the subject. If you know nothing at all about testing then it's a good introduction, and it might be useful if you have to introduce testing ideas to a non-programmer member of a team to get them more fully on board. The theory laid out in this section is general enough to be used with whatever software testing tools you happen to have.

Later chapters describe how testing is performed, but you never really know if the authors are telling you that this is the way Microsoft does it. To be clear, nowhere does the book detail how Microsoft tests any particular product, and certainly not the flagship product - Windows. Again there are lots of anecdotes, and occasionally you can't help feeling that Microsoft as a company was, and perhaps still is, naïve about how to do things. Even so you can't help being envious of its resources. The story about buying $20,000 worth of multimedia titles from the local computer store to see how Windows 95 worked with them is a case in point - ironically the point-of-sale computers couldn't cope with the number of items or the bottom line and kept on crashing. The testing was then actually carried out by asking employees to "adopt a multimedia title" - i.e. take a free program home, to keep, as a reward for running it and testing it. Pragmatic - yes. The way to go in structured software testing? Probably not. I actually enjoyed this part of the book as it is an easy-to-read mixture of practical testing theory and tiny snippets of juicy information about how the "big boys" behave behind the scenes. For example, how did they stress test Office in the early days? The answer is a stapler laid across the keyboard to produce a stream of character input. Well, it worked! There's a lot of discussion of minor testing techniques in fairly minor Microsoft products and you might well find all of this interesting, but it doesn't answer any of the really big questions.
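If you want to try the stapler idea in a less destructive form, it is essentially a crude monkey test: throw a long stream of random characters at the software and see whether it falls over. A minimal Python sketch might look like the following; process_input and the parameters are hypothetical stand-ins for whatever component you want to stress, not anything taken from the book.

import random
import string

def process_input(text: str) -> str:
    # Hypothetical stand-in for the component under test.
    return text.upper()

def monkey_test(iterations: int = 100, max_len: int = 10_000, seed: int = 42) -> None:
    # Throw long bursts of random printable characters at the component,
    # the software equivalent of a stapler laid across the keyboard.
    rng = random.Random(seed)
    for i in range(iterations):
        length = rng.randint(1, max_len)
        burst = "".join(rng.choice(string.printable) for _ in range(length))
        try:
            process_input(burst)
        except Exception as exc:
            # Any crash, however silly the input, is a bug worth recording.
            print(f"iteration {i}: {len(burst)}-character burst crashed: {exc!r}")

if __name__ == "__main__":
    monkey_test()

Swap process_input for the code you actually care about and any crashes it reports are your bug list - cheaper than a trip to the computer store, if rather less fun.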

The final verdict has to be that this is a book you could well enjoy reading, not for the promise of new knowledge, but simply to get a tiny view inside the software giant that is Microsoft and to discover that it is sometimes just as clueless as the rest of us.
