Tell Me Dave - Robots Understand Verbal Commands
Written by Lucy Black |
Sunday, 29 June 2014 |
Of course, one day robots will have to accept our commands in plain natural language rather than as precise programs. We all know how vague and messy that can be. Natural language works for us because we know the context and can use it to fill in any missing detail. A robot, on the other hand, isn't intelligent and isn't aware of the context, so when you say "get me a drink of coffee" the robot doesn't know whether it is just a matter of pouring some coffee into a cup or brewing coffee from scratch.

Now a team from Cornell is working on making a robot respond correctly to English commands, and you can help by using its simulator to command a virtual robot. The project is whimsically called "Tell Me Dave", a name that will send a shiver down the spine of any roboticist familiar with the movie 2001: A Space Odyssey.

If you would like to see it in action, watch a PR2 robot serve up some ice cream in response to the command: "Take some coffee in a cup. Add ice cream of your choice. Finally add raspberry syrup to the mixture." As the researchers point out: "We see that this sentence is fairly ambiguous in that it neither specifies which ice cream to take [which depends upon what is available] and nor does it specify all the details like taking a cup with coffee [if one exists] or firstly making coffee and if so how."
Impressive, but notice the x35 speed-up marker in the top right-hand corner. The ice cream probably melted along the way. This slowness seems to be the curse of most robot demonstrations. Perhaps we do need some radical upgrade to the computers being used.

A 3D camera is used to identify objects, which the robot has been trained to associate with their capabilities. The associations between natural language commands and plans of action are learned with the help of a simulation. This takes the form of a video game in which you carry out a command by creating a suitable sequence of actions. You can see the method in action in this second video, where the robot prepares sweet tea (again notice the speed-up factors):
If you would like to help the research, why not try your hand at being a robot by playing the simulation game? These are the sorts of tasks that robots are going to have to solve in order to perform domestic duties and find a place in our homes. It is very much a matter of bringing together a range of existing technologies and solving the problems that occur in making them work together. What is interesting is that it seems likely that you don't need full natural language understanding and a deep cognitive model to build a robot slave. Machine learning can associate commands with action plans with high probability, and no real understanding is needed.
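To make that idea concrete, here is a minimal, purely illustrative sketch of how learned associations between command phrases and primitive robot actions could be used to rank candidate action plans. This is not the Tell Me Dave model itself (the papers below describe a much richer, context-sensitive formulation); every phrase, action name and score here is an invented placeholder.

from typing import List, Tuple

# Hypothetical affinities between command phrases and primitive actions,
# of the kind that could be learned from crowdsourced simulator play.
PHRASE_ACTION_SCORE = {
    ("coffee", "grasp(cup)"): 0.9,
    ("coffee", "pour(kettle, cup)"): 0.7,
    ("ice cream", "scoop(icecream, cup)"): 0.8,
    ("syrup", "pour(syrup, cup)"): 0.85,
}

def score_plan(command: str, plan: List[str]) -> float:
    # Sum the learned phrase/action affinities that this plan covers.
    total = 0.0
    for (phrase, action), weight in PHRASE_ACTION_SCORE.items():
        if phrase in command.lower() and action in plan:
            total += weight
    return total

def best_plan(command: str, candidates: List[List[str]]) -> Tuple[List[str], float]:
    # Pick the candidate action sequence with the highest total score.
    return max(((p, score_plan(command, p)) for p in candidates),
               key=lambda pair: pair[1])

command = "Take some coffee in a cup. Add ice cream of your choice. Finally add raspberry syrup."
candidates = [
    ["grasp(cup)", "pour(kettle, cup)"],
    ["grasp(cup)", "pour(kettle, cup)",
     "scoop(icecream, cup)", "pour(syrup, cup)"],
]
plan, score = best_plan(command, candidates)
print(plan, score)  # the fuller plan wins: 3.25 vs 1.6

The real system does far more than keyword matching - the second paper listed below uses unrolled Markov random fields to cope with under-specified tasks and the current state of the environment - but the overall shape is the same: take a command, score candidate action sequences against what was learned from human demonstrations, and execute the best one.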
More Information
Tell Me Dave: Context-Sensitive Grounding of Natural Language to Mobile Manipulation Instructions, Dipendra K. Misra, Jaeyong Sung, Kevin Lee, Ashutosh Saxena. In Robotics: Science and Systems (RSS), 2014. [PDF]
Synthesizing Manipulation Sequences for Under-Specified Tasks using Unrolled Markov Random Fields, Jaeyong Sung, Bart Selman, Ashutosh Saxena. In International Conference on Intelligent Robotics and Systems (IROS), 2014. [PDF]
Related Articles
Is Agile Justin The Most Capable Robot Yet?
Pepper - Aldebaran's New Robot Designed To Be Your Friend
The Robot Bartender - A Lesson in Why Robotics Is Difficult