CPU
Written by Harry Fairhead   

The real complexity of any computer system resides in the processor, but do you know how it works? It isn't difficult - just a matter of "fetch" and "execute".

The All Important Processor

What is the most important part of any computer?

Memory and disk are usually dismissed with a simple – “how much?”

The processor is quite another level of difficulty. Mostly we drop down into “simple mode” when discussing the CPU and quote how fast it operates. Without really knowing what “GHz” means, and without even being sure of what the processor does, we compare abstract numbers certain that bigger is better - well, probably.

Today we also talk about the number of "cores" a processor has, but this is really a sign of failure. Some time ago the chip industry ran out of steam and could no longer find a way to increase the speed at which processors work. Clock rates topped out at around 3GHz and this is where processor speed has been stuck for some time. However, manufacturers could still get more and more processing power onto the silicon available. The big question was, and is, what to do with it? The simple answer is to put multiple copies of the basic processor design on the chip to create a multi-processor, or multi-core, chip.

All in all, it is the processor that makes the computer what it is.

The processor is the computer

There really is no question of the validity of this assertion.

If you don’t believe me try running a program written for a PC on a Mac.

The point is that computers with different processors are different – computers with the same processor are just faster or slower.

The details of memory management and caching may be impressive, but the real complexity of any computer system resides in the processor and it is time to look more closely at how it does what it does.

Even if you think you already know you still might find the explanation interesting. The reason is that many books and courses don’t really tell you the whole story. They stop short and leave you with a sense that the processor is somehow magic even though you know the rough outline of how it all should work.

 



Elsewhere we have discovered that what makes a computer is the intimate connection between processor and memory. When the processor places an address on the address bus a particular memory location is selected and either stores the data on the data bus or places the data stored in the location on the data bus.

Notice that this isn't magic. There isn't a little humanoid that goes and finds a particular memory location by address and then retrieves the contents for the CPU. The action is as automatic as a key in a lock. The CPU puts the address on the address bus and this selects and activates a particular memory location. The read/write line sets the memory location's behavior and it either places its content on the data bus or it "latches" or stores the contents of the data bus.

All automatic.
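To make the point concrete, here is a minimal sketch in C of a single bus cycle. The memory array, the bus_cycle function and its read flag are invented for illustration - they are not any real hardware interface - but they capture the key-in-a-lock behavior: the address selects a location and the read/write line alone decides whether it drives or latches the data bus.

    #include <stdint.h>
    #include <stdio.h>

    #define MEM_SIZE 256

    static uint8_t memory[MEM_SIZE];   /* the addressable locations */

    /* One bus cycle: the address selects a location and the read/write
       line decides its behavior - no little humanoid required. */
    uint8_t bus_cycle(uint8_t address, int read, uint8_t data_bus)
    {
        if (read)
            return memory[address];    /* location drives the data bus */
        memory[address] = data_bus;    /* location latches the data bus */
        return data_bus;
    }

    int main(void)
    {
        bus_cycle(0x10, 0, 42);                  /* write cycle */
        printf("%d\n", bus_cycle(0x10, 1, 0));   /* read cycle prints 42 */
        return 0;
    }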

Program Counter

This might well be the major operating principle of a computer but it leaves out what the processor actually “does” with the data.

After all it is called a “processor” so presumably it doesn’t just store and retrieve bit patterns. We already know how binary patterns can be used to represent numbers and we know how Boolean logic can be used to manipulate them – with addition and subtraction.
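As a quick illustration of that idea - a sketch, not the way any particular processor is obliged to do it - here is addition built from nothing but AND, XOR and a one-place shift, which is essentially a chain of half adders in software:

    #include <stdint.h>
    #include <stdio.h>

    /* Add two 8-bit patterns using only Boolean operations:
       XOR gives the sum bits, AND the carries, shifted one place left. */
    uint8_t bool_add(uint8_t a, uint8_t b)
    {
        while (b != 0) {
            uint8_t carry = (uint8_t)(a & b);  /* bits that generate a carry */
            a = a ^ b;                         /* sum without the carries */
            b = (uint8_t)(carry << 1);         /* carries move one place left */
        }
        return a;
    }

    int main(void)
    {
        printf("%d\n", bool_add(13, 29));  /* prints 42 */
        return 0;
    }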

But this is only part of what goes on. When you first start to consider the workings of the processor it is usually arithmetic that the focus falls on. The reason is that we often, mistakenly, think of computers as “computers” but for the vast majority of the time a computer is actually doing something other than arithmetic.

Once you start looking a little more closely the magic seems to be more to do with how this lump of silicon, or whatever it is made from, can obey the commands in a program. How on earth does it look at the next instruction in a program, work out what it means and then arrange the immutable hardware to do it?

Once again there is a tendency to think of a little humanoid sitting where the processor is, waiting for the next instruction to appear and then doing whatever it commands. This is, of course, not how it happens and it is all just as automatic as the memory storage and retrieval.

The “trick” that the processor performs seems very complex but it is all based on building the complex from the simple and the very regular – but isn’t this always the principle when it comes to computers?

The first thing a processor needs is some way of keeping track of where it has reached in the program. This is done using a single internal memory location, usually called the “Program Counter” or PC – although don’t be fooled by its name into thinking that it counts programs! All internal memory locations within the processor are called “registers”, for historical reasons and to indicate that they are generally just a little more than simple memory locations. For example, the PC register has two operations that it can perform. It can be initialized to a set value and it can be incremented, i.e. it can add one to the value stored in it.
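In code a register like this is little more than a variable with a fixed repertoire of operations. A minimal sketch - the pc_t type and the function names are invented here for illustration:

    #include <stdint.h>

    typedef struct { uint16_t value; } pc_t;   /* the Program Counter */

    /* The PC's entire repertoire: set it to a value... */
    void pc_init(pc_t *pc, uint16_t start) { pc->value = start; }

    /* ...and add one to whatever it holds. */
    void pc_increment(pc_t *pc) { pc->value += 1; }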

The Fetch Cycle

As well as the PC register, a processor also has an instruction register which is used to store the current program instruction. A system clock generator provides pulses that synchronize what happens in the entire machine – it doesn’t have to be this way, but non-clocked machines are much more difficult to build and, for the same reason, to describe! What happens is that the PC register contains the address of the next instruction and on the first clock pulse this address is placed on the address bus and a read cycle transfers the instruction into the instruction register. This is called the fetch cycle.


The PC register points at the location which holds the next instruction

If you want a more detailed description of the fetch cycle you also have to include delays that are necessary for everything to settle down. So the complete fetch cycle might be something like:

  1. On the rising edge of the clock pulse the PC register drives the address bus and the instruction register is set to read whatever appears on the data bus, i.e. the next instruction. However, this doesn't appear on the data bus immediately as it takes time for the memory to respond to the new address.

  2. During the clock pulse the address decoder selects the RAM location that is addressed. The fact that the read/write line is set to read means that the memory location automatically places its contents on the data bus.

  3. On the falling edge of the clock pulse the instruction register latches whatever is on the data bus and the PC register adds one to its contents.

Notice that the fetch cycle is always the same and nothing ever varies, i.e. it is easy to implement using nothing but logic gates. Let the PC drive the address bus, wait a while for everything to settle and let the instruction register latch what is on the data bus - easy!

Once the instruction has been loaded into the instruction register the PC register is automatically incremented by one. This makes sure that at the start of the next fetch cycle the very next instruction is “fetched” and the program progresses from beginning to end.
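Putting the pieces together, here is a sketch in C of the fetch cycle as a loop. The memory contents are an arbitrary made-up "program" and the clock edges are collapsed into simple sequential statements, so this shows the logic of the cycle rather than its timing:

    #include <stdint.h>
    #include <stdio.h>

    #define MEM_SIZE 256

    static uint8_t memory[MEM_SIZE];   /* holds the program */

    int main(void)
    {
        uint16_t pc = 0;   /* Program Counter: address of next instruction */
        uint8_t  ir;       /* instruction register */

        /* three arbitrary byte patterns standing in for instructions */
        memory[0] = 0xA9; memory[1] = 0x2A; memory[2] = 0x00;

        for (int cycle = 0; cycle < 3; cycle++) {
            /* rising edge: PC drives the address bus; falling edge:
               the instruction register latches the data bus... */
            ir = memory[pc];
            /* ...and the PC is incremented, ready for the next fetch */
            pc = pc + 1;
            printf("fetched 0x%02X, PC now %d\n", ir, pc);
            /* the execute cycle would decode and act on ir here */
        }
        return 0;
    }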

So far so good, but what happens to the instruction that is in the instruction register?



 
 

   