The Trick Of The Mind - Little Languages Arithmetic
Written by Mike James   
Monday, 07 March 2022

Arithmetic Expressions and Programming

Arithmetic expressions are an important part of every programming language. In the early days of computing, converting arithmetic expressions into actions was a big problem for the same reasons that children struggle with them. Clearly the problem is interpreting the expression so as to get the correct order of operations. The arithmetic expression may not be written in the order in which it has to be carried out, but to get the result you have to convert it into a set of operations carried out one after the other.

The way that the different operations have different priorities makes it difficult to convert an arithmetic expression into a simple sequence of instructions that perform the computation. Returning to our example:

1+8*2/4-3

the sequence of instructions needed to perform this computation is:

multiply 8 by 2
divide the result by 4
add 1 to the result
subtract 3 from the result

The whole process of working out an arithmetic expression is really a matter of converting it into a sequence of operations in the correct order, and this order is not the order in which the operations are written in the expression.
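To make this concrete, here is the same computation as a small sketch in Python (the choice of language is purely for illustration): the expression written on one line as a programmer would type it today, and below it the same calculation spelled out as the ordered list of single operations given above.

result = 1 + 8 * 2 / 4 - 3      # the expression as a whole

step = 8 * 2                    # multiply 8 by 2            -> 16
step = step / 4                 # divide the result by 4     -> 4.0
step = 1 + step                 # add 1 to the result        -> 5.0
step = step - 3                 # subtract 3 from the result -> 2.0

assert step == result           # both routes give 2.0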

A human has to do this job to get the right answer and so does a computer if it is to automate the process. We are so used to the idea that we can give a computer an expression like 1+8*2/4-3 and get an almost instant answer that we tend not to notice that this isn’t a trivial matter of doing the sums. The computer has to be able to convert the expression into a set of instructions that are not necessarily in the same order as the operations in the arithmetic expression – and in general this is a very hard problem.

In the early days of computers the problem was solved by getting humans to do the job. Early programmers were given arithmetic expressions and had to convert them to sequences of instructions that gave the correct answer. They had to take the expression and reduce it to the sort of list of instructions given above. This was time-consuming and error-prone and the obvious solution was to get the computer to do the job, but this proved to be difficult, very difficult.

Some computer languages, the best known being Cobol, avoided the problem by getting the programmer to write all arithmetic as a sequence of operations that were in the correct order in the first place, but this was just as inefficient. It worked for most financial calculations, where the arithmetic expressions were simple, such as “multiply price by tax rate to get tax amount”, but for scientific work it was clearly a costly burden.

Eventually programmers worked out how to convert any arithmetic expression into the correct sequence of operations and FORTRAN, the first modern computer language, was born. The name FORTRAN stands for FORmula TRANslation, so you can see that this conversion of expressions, i.e. mathematical formulas, was really at its core. It was created by a team led by John W. Backus at IBM and was an immediate success, leading on to all of the computer languages we use today. FORTRAN was released to the world in the late 1950s and this seems a very short time ago for something so fundamental to the use of computers.

You may have noticed that we have slipped in an idea that deserves more comment. FORTRAN is a computer language that allows general arithmetic expressions as commands and somehow these are converted from their non-sequential order into a set of sequential instructions for computation. Notice also that it is the computer that does this conversion, which in turn means that it is a program that actually does the job. This program is generally called a compiler because it compiles the list of instructions needed to implement the arithmetic expressions, and anything else in the FORTRAN program, into a set of sequential instructions. The task that a human once performed, i.e. the conversion of arithmetic expressions into instructions, is now done by a computer – a very commonly repeated story.
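As a rough illustration of the sort of job a compiler has to do (and it is only an illustration, not how the original FORTRAN compiler actually worked), here is a sketch in Python of the classic “shunting-yard” idea. The name to_sequence and the token format are inventions for this sketch, and it handles only the four operators with no brackets. It re-orders an expression so that each operation appears at the point where it has to be carried out:

PRIORITY = {'+': 1, '-': 1, '*': 2, '/': 2}

def to_sequence(tokens):
    # Convert infix tokens, e.g. ['1', '+', '8', '*', '2', '/', '4', '-', '3'],
    # into the same tokens re-ordered into execution order.
    output, waiting = [], []
    for tok in tokens:
        if tok in PRIORITY:
            # finish any waiting operator of equal or higher priority first
            while waiting and PRIORITY[waiting[-1]] >= PRIORITY[tok]:
                output.append(waiting.pop())
            waiting.append(tok)
        else:
            output.append(tok)    # a number goes straight to the output
    while waiting:
        output.append(waiting.pop())
    return output

print(to_sequence('1 + 8 * 2 / 4 - 3'.split()))
# ['1', '8', '2', '*', '4', '/', '+', '3', '-']
# i.e. multiply 8 by 2, divide by 4, add 1, subtract 3, the list given earlier

The re-ordered form is usually called reverse Polish, or postfix, notation and it is a common half-way house between an arithmetic expression and a sequence of instructions.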


Why Are Arithmetic Expressions Like They Are?

You don’t really need to know why arithmetic expressions use priorities to express calculations, but it is interesting. There is a sense in which arithmetic is written as it is because mathematicians just decided that it should be. If it is a human convention that makes things difficult for students and computers alike, why not just change it? The answer is that the reason for writing arithmetic in the way that we do is deeper than just doing calculations. It isn’t entirely arbitrary and it has huge benefits when things become a little more advanced.

To simplify this discussion, let’s get rid of subtraction and division – they don’t really exist. Subtraction is just the addition of a negative quantity. That is, 2-1 is the same as 2+(-1). Division is just multiplication by an inverse. That is, 2/3 is the same as 2*(1/3).
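A quick check, using Python simply because it makes a convenient calculator, shows that each rewrite gives the same value:

print(2 - 1, 2 + (-1))       # 1 1        subtraction is adding a negative
print(2 / 3, 2 * (1 / 3))    # 0.666...   division is multiplying by an inverse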

Now consider an expression like:

3*2+3*4 = 6+12 = 18

this gives the same result as:

3*(2+4) = 3*6 = 18

We have noticed that we are multiplying both the 2 and the 4 by 3 and we have opted to do the addition first and save ourselves a multiplication by “pulling the multiply out of the bracket”.

That a*(b+c) is the same as a*b+a*c is called the distributive law of multiplication and it is the reason we prefer to think of multiplication as having a higher priority than addition. If this wasn’t the case and we evaluated 3*2+3*4 strictly left-to-right:

3*2+3*4 = 6+3*4 = 9*4 = 36

then the distributive law wouldn’t be true.
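Here is what that imaginary strict left-to-right arithmetic looks like as a sketch in Python. The function left_to_right is made up for this illustration and handles only + and * with no brackets:

def left_to_right(tokens):
    # evaluate e.g. ['3', '*', '2', '+', '3', '*', '4'] ignoring all priorities,
    # applying each operator as soon as it is read
    result = float(tokens[0])
    for op, num in zip(tokens[1::2], tokens[2::2]):
        if op == '+':
            result = result + float(num)
        elif op == '*':
            result = result * float(num)
    return result

print(left_to_right('3 * 2 + 3 * 4'.split()))   # 36.0  left-to-right reading
print(3 * 2 + 3 * 4)                            # 18    normal priority rules
print(3 * (2 + 4))                              # 18    the factored form agrees with the priority rules, not with left-to-right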

To make it true in this left-to-right world you would have to write:

a*(b+c) = a*b + (a*c)

which isn’t as pretty. More to the point, it isn’t a good way to think about it and if arithmetic expressions worked in this way mathematicians would find simplifying expressions and doing algebra much less natural. So millions of students have suffered BODMAS simply to make algebra easier for mathematicians.

In other words, if you want to think of:

a*b + a*c = a*(b+c)

as “pulling a out into a bracket” then you need to make multiplication a higher priority operation than addition.

In the chapter but not in this extract:

  • Regular Expressions Little Language
  • Why A “Little” Language?

Summary

  • Little languages often occur as part of complete computer languages and they are interesting to find out about. Noticing that something is a little language is often all you need to realize how to master it.

  • Arithmetic is a sophisticated little language and its priority rules are the reason so many students struggle to understand it.

  • Rules like BODMAS, especially brackets, help get arithmetic right.

  • In the early days of programming it wasn’t obvious how to convert an arithmetic expression into an equivalent set of instructions. The first computer language to do this was FORTRAN in the late 1950s.

  • The reason for the complicated priority system is to make the distributive law easy to work with.

  • Regular expressions are another very common little language found in most big programming languages.

  • A regular expression is a specification for a target string that you are looking for in a larger string.

  • Quantifiers *, + and ? are used to control the number of times a character can match.

  • Regular expressions can become very complex and difficult to understand.

  • They form a little language because they do not allow details of earlier matches to affect later matches and this means that there are patterns for which it is impossible to create regular expressions.

 

Related Articles

Grammar and Torture

The Computer - What's The Big Idea?

The Essence Of Programming

The Trick Of The Mind - Programming & Computational Thought

Buy Now From Amazon


Chapter List

  1. The Trick Of The Mind

  2. Little Languages
       Extract: Little Languages Arithmetic

  3. Big Languages Are Turing Complete

  4. The Strange Incident of The Goto Considered Harmful
       Extract: The Goto Considered Harmful

  5. On Being Variable  

  6. Representation

  7. The Loop Zoo
       Extract: The Loop Zoo
       Extract: Advanced Loops

  8. Modules, Subroutines, Procedures and Functions
       Extract: Modular Programming

  9. Top-Down Programming 

  10. Algorithms
       Extract: Binary Search 
       Extract: Recursion

  11. The Scientific Method As Debugging 

  12. The Object Of It All
       Extract: Why Objects





