The Magic Number Seven And The Art Of Programming
Written by Sue Gee   
Thursday, 19 March 2020

The number seven is very important in programming and many other intellectual endeavors. Why is it magic and what significance does it have for us poor limited humans?

All good programmers need to be part-time psychologists, maybe even full-time ones.

The reason?

There are many, but let's start with a very obvious one.

What would you say was the most important skill in creating user interfaces?

The Psychology Of Bandwidth

True, you have to have the technical skill and equipment to implement your ideas; but without an understanding of how users work, your ideas will be mostly rubbish.

Understanding how users work, in the broadest possible sense, is usually called psychology - hence my opening statement. Rather than launch off into another discussion of user interfaces, an admittedly important topic, the time has come to consider something much simpler - overload.

One of psychology's early phases (there have been many) was based on information or communications theory. It viewed the human processor as an information channel. The idea was that the human wasn't so much a data processing system as a data communications system. The inputs were the senses and the outputs were whatever the human did or perceived as a result of the inputs.

It studied the channel capacity and the effects of noise on signal processing. There are some theorems relating things like reliability, bandwidth and noise and these are applicable to humans just as much as to networks and radio channels. 

The exact details aren't particularly important, but what is important is the obvious statement that humans are a limited data channel. One of the nicest and simplest results in this area is the 'magic number seven' or Miller's law after the psychologist who discovered it. (See: The Magic Number Seven, Plus or Minus Two.)

 

After lots of experiments it was found that, on average, humans can keep about seven items of information in short-term memory. The variation observed in the experiments was such that a more accurate statement of memory capacity is seven items, plus or minus two - hence the title of the paper referenced above.

You can argue about the fine detail and whether or not the number seven deserves its special position. You can also argue about the limits applying to different kinds of cognitive processing, but the important point is that there is a limit, not its actual value. Humans are bandwidth limited information processing machines. We have a channel capacity or a data rate limit just like any data bus.

Programming And Seven

What has this got to do with programming?

The answer is quite a lot.

As a programmer you too are subject to the same information processing limits as any lesser mortal. This is the reason why all programming methods endeavour to hide the complexity of a program in a layered structure. You will often hear programmers talking about complexity reduction, but actually what is going on is complexity hiding - also known in the wider world as "chunking".

In modular programming, programs are built up out of small chunks of code in the form of subroutines, functions or modules. Each module should be small enough to fit into the seven plus or minus two category. Each layer should equally be confined to the same size limits, so that every layer can be easily understood.
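
As a rough sketch of the idea, here is a tiny, hypothetical word-statistics module in Python (the task and the names are invented for illustration): each function is a single chunk, short enough to take in at one look, and the top-level function is built from calls to the others.

```python
# A hypothetical word-statistics module: every function is one small chunk,
# well inside the seven-plus-or-minus-two limit.

def words(text):
    """Break the text into words - one conceptual step."""
    return text.split()

def longest_word(text):
    """Find the longest word - another single chunk."""
    return max(words(text), key=len)

def summary(text):
    """The top layer reads like a sentence built from the chunks below."""
    return f"{len(words(text))} words, longest is '{longest_word(text)}'"

print(summary("the magic number seven plus or minus two"))
```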

There is also a subtle effect caused by the nesting of control structures. One if statement is easy to understand, but multiple nested if statements are much harder to untangle. The same is true of loops nested within loops within ifs, and so on. Nested control structures make it difficult to chunk.
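
A small illustration (the data and the names are made up): both functions below pick out the names of active adult users, but the first nests three decisions inside a loop and has to be untangled as a whole, while the second keeps each decision flat enough to be read as a separate chunk.

```python
# Hypothetical example: extract the names of active users aged 18 or over.

def adult_names_nested(users):
    # Three levels of nesting - hard to hold in mind all at once.
    result = []
    for user in users:
        if user.get("active"):
            if "age" in user:
                if user["age"] >= 18:
                    result.append(user["name"])
    return result

def adult_names_flat(users):
    # The test is pulled out into its own small chunk.
    def qualifies(user):
        return user.get("active") and user.get("age", 0) >= 18
    return [u["name"] for u in users if qualifies(u)]

people = [{"name": "Ada", "active": True, "age": 36},
          {"name": "Tom", "active": False, "age": 40}]
print(adult_names_nested(people))   # ['Ada']
print(adult_names_flat(people))     # ['Ada']
```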

The idea behind chunking is that a human can deal with about seven things at a time, but by working at different levels of abstraction you can cope with bigger systems. That is, you can remember seven letters, or seven words, or perhaps seven sentences, and certainly the titles and plots of seven very long books. The principle of chunking applied to programming is to build a hierarchy of code, each layer composed of small objects that can be understood at a single look.
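
To make the hierarchy concrete, here is a sketch of a hypothetical report formatter (all names invented) in which each level is just a handful of calls to the level below - details at the bottom, sections in the middle, the whole report at the top.

```python
# Hypothetical report formatter: three levels of abstraction, each one small.

def render_header(title):                 # bottom level: one detail
    return f"== {title} =="

def render_row(name, value):              # bottom level: one detail
    return f"{name}: {value}"

def render_section(title, rows):          # middle level: a few details
    lines = [render_header(title)]
    lines += [render_row(n, v) for n, v in rows]
    return "\n".join(lines)

def render_report(sections):              # top level: a few sections
    return "\n\n".join(render_section(t, r) for t, r in sections)

print(render_report([("Totals", [("files", 3), ("lines", 120)])]))
```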

I have occasionally encountered too literal implementations of the 'seven' part of the rule. For example, a project I was involved in had the arbitrary limit of no more than seven lines of active code per subroutine - the project leader was brother to Genghis Khan and a fine warrior, but he applied rules too literally and too brutally.

In some more elaborate design methods such as Yourdon/DeMarco you will find people converting guidelines such as 'no more than around seven bubbles in a structure chart' into a similarly exact law.

I have yet to come across an explicit 'seven' rule in object-oriented programming but I guess it's only a matter of time and a deep enough search. 

The point is that seven is a guideline and the actual number depends on all sorts of factors, but "Keep It Small, Stupid" is a much better statement of intent than "Keep It Simple, Stupid".

Well now you know the origin of the magic number seven that crops up so often and you also know that seven could just as well be five or nine.

In fact the situation is much vaguer than even these loose numbers would suggest. The real confounding factor is what exactly constitutes an item to which the "seven" applies.

Should it be seven characters, seven words, seven lines, seven subroutines, seven objects, seven programs....

Clearly the unit used in the rule of seven matters. Basically the unit should always be scaled to fit into the situation so that it is one chunk of comprehension or memory. 

Easy to say but usually much harder to do. 

The Unit Of Comprehension

Focusing on code for a moment, it is clear that what matters is the structure of a function or method.

Do you understand and work with a method as a single thing or as a composite unit? Again, the rule that each layer should have limited complexity is justified in terms of the sliding scale of the fundamental unit of comprehension.

We use hierarchies to manage complexity so that each level in the hierarchy is as simple as the next, as long as you are working with the correct unit of comprehension.

There is another more subtle problem.

The unit of comprehension can change with time and with who exactly is doing the comprehending.

When you write a small chunk of code you are on top of it all and your unit of comprehension might be very large - it all seems so simple. But when you come back a few weeks later, your unit of comprehension has shrunk. Now it would be better if the method consisted of just seven lines of very simple code.
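
By way of a sketch (the calculation is an invented example), the two functions below compute the same trimmed mean. The first felt obvious on the day it was written; the second is the handful of very plain lines that the reader who turns up a few weeks later, quite possibly you, will be grateful for.

```python
# Hypothetical example: the mean after dropping the smallest and largest value.

def trimmed_mean_clever(values):
    # Clear today, a puzzle in a month.
    return sum(sorted(values)[1:-1]) / (len(values) - 2)

def trimmed_mean_plain(values):
    ordered = sorted(values)
    trimmed = ordered[1:-1]          # drop the smallest and largest value
    return sum(trimmed) / len(trimmed)

print(trimmed_mean_clever([1, 2, 3, 4, 100]))   # 3.0
print(trimmed_mean_plain([1, 2, 3, 4, 100]))    # 3.0
```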

If there is a lesson to be learned from this aspect of "seven" then it is not to overestimate yourself or your colleagues. If you write a nice little code chunk today, imagine the lesser mortal who will have to maintain it.

If you expect this then plan for it!

Make your code easy enough for the dummy to cope with; who knows, the dummy may, in fact almost certainly will, turn out to be you.


