An Equation For Intelligence
Written by Mike James   
Sunday, 09 February 2014

It is something like the philosopher's stone. A single equation for intelligence. A sort of E=mc² that would put intelligence, and more particularly artificial intelligence, on a sound theoretical footing. But could it be as simple as this TED talk video suggests?

 

Alex Wissner-Gross thinks that he has worked out the physical basis for intelligence. He has devised an equation that in his opinion tells every system how to behave in a way that we would label intelligent. His equation is easy enough to write down:

F = T ∇S

The S is related to the entropy of the system and the T is a notional "temperature". If you know math then the triangle symbol will be familiar as the gradient, and the equation roughly says that there is a force in the direction of increasing entropy. This force - the causal entropic force - is what is equated to intelligence. 
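To see what the force law means operationally, here is a minimal numerical sketch. The entropy function S and the temperature T here are toy stand-ins chosen for illustration - the paper's actual causal entropy is far harder to compute - but the mechanics are the same: a state repeatedly nudged along T times the gradient of S climbs toward the entropy maximum.

```python
# Minimal sketch of the force law F = T * gradient of S.
# S is a hypothetical toy entropy peaked at x = 1; T is the notional temperature.

def grad(S, x, h=1e-5):
    """Central-difference estimate of dS/dx at x."""
    return (S(x + h) - S(x - h)) / (2 * h)

T = 2.0                        # notional "temperature" scaling the force
S = lambda x: -(x - 1.0) ** 2  # toy entropy, maximal at x = 1

x = 0.0
for _ in range(100):
    x += 0.01 * T * grad(S, x)  # move the state along the causal entropic force
# x has climbed close to the entropy maximum at 1.0
```

The loop is just gradient ascent on S; the "intelligent" behaviour, on this account, is nothing more than following that gradient.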

If you have encountered the ideas of thermodynamics or information theory then you may already know that entropy and information are intertwined in ways that are very complicated once you go beyond the general ideas of order and disorder. So it isn't really surprising that entropy should figure in any theory of intelligence. This isn't the first time that "thinkers" have linked the two together, but mostly the connection has been stated in vague, almost philosophical, ways.

Now we have an exact equation. 

If you read the paper that explains how it all works then you will realize that the exact equation embodies a philosophical principle: intelligence is behaviour motivated by the need to keep as many options open as possible. It attempts to reach states that maximize the freedom to act. That is, if you build a system that moves its state in the direction of the causal entropic force, the system will move towards a state that maximizes the causal entropy. 

Of course, what is hidden in all of this is the exact definition of causal entropy - because this isn't the same as the usual entropy of a system.

Causal entropy is a path integral over the probabilities of the system evolving from its current state to new states.

If you examine the formulation more carefully then you will notice that in fact it is the information content of a path leading from the current state to a future state that is integrated over all paths. The causal entropic force causes the system to evolve towards the state that maximizes this integral, i.e. a state with lots of highly probable future states. 
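A crude way to make this concrete is a random walker in a one-dimensional box - an echo of the particle-in-a-box example mentioned below. The sketch is an illustration only, not the paper's method: it counts the feasible ±1 paths of a fixed horizon from each position and takes the log of that count as a stand-in for causal entropy, which is exact only if all paths are equally probable. The position with the most open futures turns out to be the centre of the box.

```python
import math

def num_paths(x, steps, lo, hi):
    """Count the +/-1 walk paths of the given length that stay inside [lo, hi]."""
    if steps == 0:
        return 1
    return sum(num_paths(nxt, steps - 1, lo, hi)
               for nxt in (x - 1, x + 1) if lo <= nxt <= hi)

# Log path count as a crude stand-in for causal entropy on a 7-site box.
lo, hi, horizon = 0, 6, 7
causal_entropy = {x: math.log(num_paths(x, horizon, lo, hi))
                  for x in range(lo, hi + 1)}
best = max(causal_entropy, key=causal_entropy.get)  # state with the most open futures
```

The causal entropic force is then just the discrete gradient of this table: a walker pushed up the gradient drifts to the middle of the box, where the most futures remain reachable.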

The video explains some of this and provides examples of the principle in action where it is claimed to replicate a number of "human-like" intelligent behaviours including cooperation and tool use. 

 

 

Impressed?

There are a lot of questions you should be asking. The examples given are of contrived systems, not natural ones. No cart spontaneously balances a pendulum, particles do not move to the center of a box, and so on. There is no evidence that maximizing causal entropy has any correspondence to real physical systems.  

In addition, all of the examples are of continuous dynamics over-interpreted into something more meaningful to the viewer. An ape with a tool trying to get some food isn't anything like a few disks bouncing around a box. The ape is motivated by a huge range of things, not just an innate desire to maximize its future freedom to act. True, eating some food does give the ape an increased freedom to act, but one grape or grub, say, is hardly going to set up a gradient down which intelligence flows. 

It could be that some measure connected to entropy is related to intelligent behaviour. Recent work has suggested, for example, a connection between entropy, reversibility and why things spontaneously organize into reproductive units when there is abundant energy. This equation of life seems to have a much better chance of being true than the equation of intelligence. 

The trouble with AI is that it is too easy to make something trivial look sophisticated if you put the right spin on it - and the right spin usually turns out to be physics.

 

More Information

Causal Entropic Forces

A New Physics Theory of Life

Related Articles

The Virtual Evolution Of Walking

Never Ending Image Learner Sees, Understands, Learns

Deep Learning Applied To Natural Language

The Truth About Spaun - The Brain Simulation

 


 


 




 

 

Last Updated ( Sunday, 09 February 2014 )