Hinton, LeCun and Bengio Receive 2018 Turing Award
Written by Mike James   
Wednesday, 27 March 2019

Often said to be the Nobel Prize of computing, the Turing Award goes this time round to three pioneers of neural networks. The citation refers to the three as "Fathers of the Deep Learning Revolution" but I see one father and two offspring.


Established in 1966 and named for Alan M. Turing, this award is the most prestigious of those made by the ACM (Association for Computing Machinery) and is now worth $1 million, shared between the awardees.

It is presented annually to recognize the contributions of computer scientists and engineers who have:

created the systems and underlying theoretical foundations that have propelled the information technology industry.

There is no question that this year's recipients have pushed forward the boundaries of deep learning. They are responsible for much of the successful AI we are starting to take for granted.

"Working independently and together, Hinton, LeCun and Bengio developed conceptual foundations for the field, identified surprising phenomena through experiments, and contributed engineering advances that demonstrated the practical advantages of deep neural networks. In recent years, deep learning methods have been responsible for astonishing breakthroughs in computer vision, speech recognition, natural language processing, and robotics—among other applications."

All true, and LeCun and Bengio have certainly done some groundbreaking work, but Geoffrey Hinton is arguably the only real father that this area of computing can look to. He was applying the backpropagation method in the early 1980s, when just about everyone else believed that the whole idea of neural networks was a dead end and a waste of time.


Yann LeCun was a postdoctoral researcher in Hinton's lab in the late 1980s and is best known for creating convolutional neural networks. Yoshua Bengio has worked with Hinton on deep learning papers, but he is more of an academic, typically publishing 20 papers a year. Arguably his most important work in the long term is likely to be on biologically inspired learning methods. The volume of Bengio's output has to be kept in mind when you interpret the figure of 131 citations a day, the most of any computer scientist, followed closely by Hinton's 127 and, further back, LeCun's 62.


When you look at Hinton's work you get a very different picture. He has focused on "big ideas" at every turn, staying true to multilayered neural networks when most were of the opinion that they could never work, as indicated by their poor performance. He then switched to working on Boltzmann machines, which are still a theoretical headache today. To make progress, he simplified them to create the Restricted Boltzmann machine, which led to the auto-encoder and many pre-training methods. Eventually computing power caught up with the demands of deep networks and, with some care, it was possible to train much deeper networks, only to discover that this was the solution he had been looking for all along. Since then he has been involved in constructing other, more sophisticated architectures based on neural networks - capsule networks - which may in the long run prove to be the most important of all.

While I am of the opinion that Yann LeCun and Yoshua Bengio have made huge contributions to the subject, Geoffrey Hinton has been working at it for longer and has instigated and shared many of its big ideas. When you make a list of AI researchers that has Yann LeCun or Yoshua Bengio on it, there are many other candidates to be included, but if the list starts with Geoffrey Hinton you have entered a different league.

ACM will present the 2018 A.M. Turing Award at its annual Awards Banquet on June 15 in San Francisco, California.

 

More Information

Fathers of the Deep Learning Revolution Receive ACM A.M. Turing Award

Related Articles

RISC Pioneers Gain Turing Award

Turing Award Now Million Dollar Prize

ACM Celebrates 50 Years of Turing Award 

Tim Berners-Lee Awarded Turing Prize

Diffie and Hellman Receive Turing Award 

Geoffrey Hinton Says AI Needs To Start Over

Facebook's Yann LeCun On Everything AI 

Facebook's New AI Lab In Montreal 

Google Expands AI Research In Montreal 

AlphaGo Triumphs In China

Why AlphaGo Changes Everything

Evolution Is Better Than Backprop? 

Why Deep Networks Are Better

The Triumph Of Deep Learning

 



