Geoffrey Hinton Shares Nobel Prize For Physics 2024
Written by Sue Gee
Tuesday, 08 October 2024
Geoffrey Hinton shares the Nobel Prize for Physics 2024 with John Hopfield, for "foundational discoveries and inventions that enable machine learning with artificial neural networks."

Pictured on the left, John Joseph Hopfield, born July 15, 1933 in Chicago, USA, is an emeritus professor at Princeton University. He is receiving the Nobel Prize for Physics 2024 for his study of associative neural networks, in particular for the development of the Hopfield network, which he devised in 1982 while at Caltech. Pictured on the right, Geoffrey Everest Hinton, born December 6, 1947 in London, UK, is currently affiliated with the University of Toronto, Canada. His share of the Nobel Prize for Physics recognizes the way he used the Hopfield network as the foundation for a new, more general network: the Boltzmann machine, which can be used to classify images or create new examples of the type of pattern on which it was trained. While this work dates back to 1985, Hinton continued to build on it, helping initiate the current explosive development of machine learning.

Explaining the decision to award Hopfield and Hinton the Nobel Prize, Ellen Moons, Chair of the Nobel Committee for Physics, said:

"The laureates' work has already been of the greatest benefit. In physics we use artificial neural networks in a vast range of areas, such as developing new materials with specific properties."

Commenting on the news, Hinton said that he was "flabbergasted" and that he hadn't been aware that he had been nominated.

Hinton, who is often referred to as the "godfather of artificial intelligence", is no stranger to winning prizes. In 2020, when he was awarded the Royal Medal by the Royal Society for his work on artificial neural networks, our report listed other prestigious prizes he had already won, including the ACM Turing Award, often referred to as the Nobel Prize for Computing, which he shared with Yann LeCun and Yoshua Bengio.
This time, however, it is the real Nobel Prize, awarded in Physics by the Royal Swedish Academy of Sciences with a sum of 11 million kronor (around $1.1 million). His co-recipient is John Hopfield, who may currently not have as prominent a profile as Hinton, but who is in fact the lead prize winner. According to the Nobel Prize press release:

This year's two Nobel Laureates in Physics have used tools from physics to develop methods that are the foundation of today's powerful machine learning. John Hopfield created an associative memory that can store and reconstruct images and other types of patterns in data. Geoffrey Hinton invented a method that can autonomously find properties in data, and so perform tasks such as identifying specific elements in pictures.

The full story of how the technology underpinning GenAI, machine learning using artificial neural networks, was inspired by the structure of the brain is expanded on in an article titled They used physics to find patterns in information, which you should read if you want to understand how the Hopfield network, proposed in 1982 to model associative memory, was expanded using ideas from statistical physics by Hinton and his colleague Terrence Sejnowski in 1985. This new network, called a Boltzmann machine because it utilised an equation put forward by the nineteenth-century physicist Ludwig Boltzmann, could learn by being given examples.

Hinton was initially a champion of the standard feed-forward network trained using backpropagation. He was a voice in the wilderness back then, as most people had given up on neural networks; they were unfashionable and considered "fringe" activities akin to investigating the paranormal. Eventually even Hinton gave up on that approach and decided that Hopfield's network might be better if it could be generalized, and so the Boltzmann machine was born.
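To get a feel for the associative memory idea the prize citation describes - storing a pattern and reconstructing it from a corrupted copy - here is a minimal sketch of a Hopfield network in Python. It is an illustrative reconstruction using bipolar (+1/-1) units and the classic Hebbian storage rule, not code from either laureate; the function names and the tiny 8-unit pattern are invented for the example.

```python
# Minimal Hopfield associative memory sketch.
# Assumes bipolar (+1/-1) patterns and Hebbian storage; illustrative only.
import numpy as np

def train(patterns):
    """Hebbian rule: W = sum of outer products p p^T, with zero diagonal
    so a neuron does not reinforce its own state."""
    W = patterns.T @ patterns
    np.fill_diagonal(W, 0)
    return W

def recall(W, state, steps=10):
    """Repeatedly update every unit to the sign of its weighted input,
    stopping once the state no longer changes (a stable attractor)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1          # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Store one 8-unit pattern, then recover it from a copy with one flipped bit.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]               # corrupt the first unit
restored = recall(W, noisy)
```

Running this, `restored` matches the stored pattern again: the corrupted state falls back into the attractor the Hebbian weights created, which is exactly the "store and reconstruct" behaviour of an associative memory.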
The Boltzmann machine proved difficult to train, but after much experimentation it led to the restricted Boltzmann machine - not as powerful, but easier to train. The problem with the original feed-forward network was that the computer power of the time was not enough to train more than a few layers of tens to hundreds of neurons. The breakthrough came when available power increased enough to allow deep and large networks. Hinton's research moved back to feed-forward networks, but now with many layers and lots of neurons - deep learning. He remarked that he had had the solution all along, but didn't have the computer power to prove it. Today it is the deep neural network that drives our AI, and the Hopfield and Boltzmann networks are something of an intriguing curiosity.

More Information

They used physics to find patterns in information (pdf)

Related Articles

Geoffrey Hinton Awarded Royal Society's Premier Medal

Hinton, LeCun and Bengio Receive 2018 Turing Award

Geoffrey Hinton Says AI Needs To Start Over

Neural Turing Machines Learn Their Algorithms

To be informed about new articles on I Programmer, sign up for our weekly newsletter, subscribe to the RSS feed and follow us on Twitter, Facebook or Linkedin.
Comments
or email your comment to: comments@i-programmer.info
Last Updated ( Tuesday, 08 October 2024 )