Now we have a complete picture of what happens during the execute phase, and we can even add to the description the necessary delays while everything settles down. Everything happens at times determined by the system, or processor, clock. Exactly when things happen varies according to the particular type of processor, but usually the rising and falling edges of the clock pulse mark the moments at which things happen.
For a register load instruction that has just been placed in the instruction register by the previous fetch, the sequence would be:
On the rising edge of the execute clock pulse the register select bits are decoded and one of the register select lines is activated. The “action” part of the op-code is decoded and one of the register control lines – load in this case – is activated. The address portion of the instruction drives the address bus. Notice that all of this happens at the same time, as there are separate logic gates for each part of the action.
During the clock pulse the memory decodes the address bus and the addressed location drives the data bus – and everything is given time to settle down.
On the falling edge of the clock pulse the selected register latches the data bus.
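The three steps above can be sketched in code. This is only an illustrative model, not any real processor: the instruction fields, bus names, and the data value are all invented for the example.

```python
# Hypothetical sketch of the clocked execute phase for a register load.
# Field names, addresses, and values are illustrative only.

memory = {0x2A: 0x99}            # the addressed location holds 0x99
registers = {"A": 0, "B": 0}

def execute_load(instruction):
    # Rising edge: decode the register-select bits and the "action" part
    # of the op-code, and let the address portion drive the address bus.
    # In hardware these happen simultaneously via separate gates; Python
    # just writes them as successive assignments.
    reg_select = instruction["register"]   # activated register select line
    action = instruction["action"]         # activated control line ("load")
    address_bus = instruction["address"]

    # During the pulse: the memory decodes the address bus and the
    # addressed location drives the data bus (the settling time).
    data_bus = memory[address_bus]

    # Falling edge: the selected register latches the data bus.
    if action == "load":
        registers[reg_select] = data_bus

execute_load({"register": "A", "action": "load", "address": 0x2A})
print(hex(registers["A"]))  # 0x99 - the A register latched the data bus
```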
And, of course, after the execute phase there follows another fetch and so on until the program is complete.
This is the basic principle of the computer and the way that the CPU works.
You don’t need any more hardware or additional operating principles to make a machine that does most of the things you need.
For example, to add two numeric values you don’t need a special instruction that adds two memory locations together. All you need is the instruction that loads the A register and one that adds the contents of a memory location to the A register’s current contents. Notice that you don’t need any additional kind of clock cycle beyond fetch and execute. The add hardware is implemented so that, instead of loading the register with the contents of the data bus, the value on the data bus is added to the register.
The operation of repeatedly adding values to the A register, i.e. “accumulating” a sum, is how the A register gets its name.
The extra hardware needed to make the A register into an accumulator is simply a full adder that adds the value on the data bus to the value on the output of the A register. The output of the full adder is fed back into the input of the A register.
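The adder-with-feedback arrangement can be sketched as a few lines of code. The class name and the eight-bit width are assumptions for the example, not part of the original description.

```python
# Sketch of the accumulator: a full adder takes the data bus and the
# A register's own output, and the adder's output feeds back into the
# register. The 8-bit width is an assumption; the carry out is dropped.

WIDTH = 8
MASK = (1 << WIDTH) - 1

class Accumulator:
    def __init__(self):
        self.value = 0          # the A register's current contents

    def add(self, data_bus):
        # full adder: register output + data bus, truncated to WIDTH bits
        self.value = (self.value + data_bus) & MASK

acc = Accumulator()
for v in (10, 20, 30):          # repeatedly "accumulating" a sum
    acc.add(v)
print(acc.value)                # 60
```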
How to accumulate
If you don’t want to make things complicated you can even use the same hardware arrangement to implement a “Load A” operation by simply blocking the output of the A register during an “Add to A” operation. In practice, though, a modern processor will have a number of internal buses connecting the registers to the Arithmetic and Logic Unit or ALU – of which our full adder is just the beginning.
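The blocking trick is worth a small sketch: if the register's feedback into the adder can be forced to zero, the same adder does both operations. The control-signal name is invented for the example.

```python
# Sketch of using one adder for both "Load A" and "Add to A": when the
# feedback path from the register is blocked (forced to zero), the adder
# computes 0 + data bus, which is a plain load. Hypothetical names.

WIDTH = 8
MASK = (1 << WIDTH) - 1

class ALURegister:
    def __init__(self):
        self.value = 0

    def clock(self, data_bus, block_feedback):
        # blocked feedback -> adder sees 0 + data_bus -> a load;
        # otherwise it sees value + data_bus -> an add
        feedback = 0 if block_feedback else self.value
        self.value = (feedback + data_bus) & MASK

a = ALURegister()
a.clock(0x40, block_feedback=True)    # "Load A" with 0x40
a.clock(0x02, block_feedback=False)   # "Add to A" 0x02
print(hex(a.value))                   # 0x42
```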
Question: what does a full adder want to be when it grows up?
Answer: an ALU…
Finally just to show you that everything you could possibly want can be easily included in this simple processor architecture consider how you might implement a “jump to xxxx” instruction.
Normally the next instruction to be executed is in the next sequential memory location but a jump makes the instruction at xxxx the next one.
How can this be achieved? Easy!
Just make the PC register correspond to register address 00 (rather than general purpose register D as suggested earlier). Now consider what “load PC from address aaaa” does. It loads the PC register with the value stored in “aaaa” and so makes this the next instruction. Not quite what was required, but it isn’t difficult to see how it can be modified to make it work exactly as specified.
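A short sketch makes the distinction concrete. One plausible reading, assumed here, is that the existing instruction gives an indirect jump (PC gets the *contents* of aaaa), while the modification loads the address field itself into the PC. The names and values are illustrative.

```python
# Sketch: the PC is just register 00, so loading it changes which
# instruction runs next. "load_pc_from" is what the existing load
# instruction does; "jump_to" is the assumed modification that loads
# the address field itself. All names and addresses are hypothetical.

memory = {0x10: 0x80}      # location 0x10 holds the value 0x80

class CPU:
    def __init__(self):
        self.pc = 0

    def load_pc_from(self, aaaa):
        # the unmodified load: PC gets the *contents* of aaaa
        self.pc = memory[aaaa]

    def jump_to(self, xxxx):
        # the modified form: PC gets the address field itself
        self.pc = xxxx

cpu = CPU()
cpu.load_pc_from(0x10)
print(hex(cpu.pc))   # 0x80 - next instruction comes from the contents
cpu.jump_to(0x10)
print(hex(cpu.pc))   # 0x10 - next instruction comes from xxxx itself
```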
But this brings us to the interesting topic of addressing modes and that’s another story.