AI Learns To Program Super Mario Bros
Written by Mike James
Wednesday, 13 September 2017
This sounds like another thing to be worried about, if, that is, you are worried about AI taking over the world. We might be getting used to AI learning to beat humans at arcade games and even Chess and Go, but now it is learning to program games just by watching.

This is interesting and has practical applications, but it isn't the same sort of AI breakthrough that we have been getting used to with neural networks of all types, even though some reports would have us think so. The research, presented at the International Joint Conference on Artificial Intelligence, Aug. 19-25, in Melbourne, Australia, takes a much more structured approach and relies on pre-engineered frameworks. Before neural networks took off this is what AI was like, and there is no reason to dump this sort of approach. For many tasks it is actually better; for one thing, it learns a lot faster than generalized AI approaches that involve long and costly training.

What do I mean by "pre-engineered" frameworks? Well, in this case the program was given a set of sprite images that it could use to find sprites within video footage of the game being played. It was then able to deduce the rules by which each sprite moves, essentially inferring a set of if..then style rules for the motion of each sprite (a rough sketch of this idea appears below). This provides a framework for the environment that would otherwise have taken a lot of time and training to learn from scratch. As a result, the AI learned the game in less than two minutes of watching a video of the game being played from start to end.

According to Matthew Guzdial, lead researcher and Ph.D. student in computer science at the Entertainment Intelligence Lab at Georgia Tech:

"Our AI creates the predictive model without ever accessing the game's code, and makes significantly more accurate future event predictions than those of convolutional neural networks. A single video won't produce a perfect clone of the game engine, but by training the AI on just a few additional videos you get something that's pretty close."

Notice that while the rules were derived from a video, when applied to a game engine they produced a version of the game that could be played and was judged to be a good copy.
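To make the general idea more concrete, here is a minimal, purely illustrative sketch: match known sprite images against each video frame, track how each sprite's position changes between frames, and summarise the observed transitions as if..then style rules. The function names and data structures are assumptions for illustration only; they are not the researchers' actual system, which is considerably more sophisticated.

```python
# Illustrative sketch only -- not the researchers' code.
# Frames and sprites are assumed to be greyscale NumPy arrays.
from collections import defaultdict

import numpy as np

def find_sprite(frame, sprite):
    """Return the (row, col) of the best match of `sprite` in `frame`,
    found by a brute-force sum-of-squared-differences search."""
    fh, fw = frame.shape
    sh, sw = sprite.shape
    best_err, best_pos = None, None
    for r in range(fh - sh + 1):
        for c in range(fw - sw + 1):
            patch = frame[r:r + sh, c:c + sw].astype(float)
            err = np.sum((patch - sprite) ** 2)
            if best_err is None or err < best_err:
                best_err, best_pos = err, (r, c)
    return best_pos

def infer_movement_rules(frames, sprites):
    """For each named sprite, count how often each (d_row, d_col) step
    occurs between consecutive frames and report the observations as
    simple if..then style rules."""
    moves = defaultdict(lambda: defaultdict(int))
    prev = {}
    for frame in frames:
        for name, img in sprites.items():
            pos = find_sprite(frame, img)
            if prev.get(name) is not None and pos is not None:
                dr = pos[0] - prev[name][0]
                dc = pos[1] - prev[name][1]
                moves[name][(dr, dc)] += 1
            prev[name] = pos
    rules = []
    for name, counts in moves.items():
        for (dr, dc), n in sorted(counts.items(), key=lambda kv: -kv[1]):
            rules.append(f"if sprite == '{name}' then move by ({dr}, {dc})  # seen {n} times")
    return rules
```

Passing a list of decoded video frames and a dictionary of sprite images to infer_movement_rules would print out rules such as "if sprite == 'goomba' then move by (0, -1)". A real system would, of course, also have to condition the rules on collisions, player input and other game state, but the point is the same: given the sprites as a framework, the rules can be read off from observation rather than learned from scratch.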
While the achievement is important and useful, it isn't quite as spectacular as you might be led to believe. This isn't a neural network being shown a video game and learning to construct the program in some general sense. The team did try a neural network on the same task of predicting the next frame in the video sequence; even after reducing the resolution it didn't do too well. This illustrates the usefulness of the method: if you can put structure into the solution of an AI-like problem, the result can work much better.
More Information
Game Engine Learning from Video by Matthew Guzdial, Boyang Li, and Mark Riedl

Related Articles
AI and Games Pioneer, A L Samuel
The Malmo Challenge - Collaborative AI
Google's DeepMind Learns To Play Arcade Games
Last Updated (Wednesday, 13 September 2017)