Nao Plays Music Like A Human
Written by Lucy Black
Sunday, 02 November 2014
You may have seen lots of cute videos of Nao playing a musical instrument, but these have mostly been open loop - that is, if you took the instrument away Nao would carry on playing to thin air. In this video things are different: Nao sees and hears the instrument and adjusts its playing to get the best sound.

This is one of the problems with robotics - you can make robots look more capable than they really are. Take a humanoid robot like Nao and put a percussion instrument in front of it. All you have to do to get a tune out of the robot is program a set of absolute, pre-computed movements - for example, see Nao Plays Jingle Bells. Each note is struck only as well as each movement was programmed. If something changes, such as the removal of the instrument, the robot simply carries on, striking the air and making no sound. Until you know this, it really does look as if Nao is playing the instrument in something like the way a human player might. In the video here, however, you would be correct in making that assumption.
The robot uses its onboard cameras to do the image processing needed to localize each bar of the metallophone. Inverse kinematics is then used to compute the movements required to take the arm holding the hammer to the correct bar. Notice that you need not only a final position but also a final velocity, so that the note is struck correctly. Given the required note sequence, the inverse kinematics computes a good set of joint movements to play the entire tune.
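To get a feel for the inverse kinematics step, here is a minimal sketch for a simplified two-link planar arm - this is not the Nao-specific solver used in the work, and the link lengths are made-up illustrative values:

```python
import math

def two_link_ik(x, y, l1=0.105, l2=0.113):
    """Closed-form inverse kinematics for a planar two-link arm.
    Given a target (x, y) for the hammer tip, return (shoulder, elbow)
    joint angles in radians, or None if the bar is out of reach.
    Link lengths l1, l2 are illustrative, not Nao's real dimensions."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        return None  # target outside the arm's workspace
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

The real problem is harder: the full solver must also satisfy a downward velocity at impact and chain strikes for the whole tune, which is why the paper's approach plans joint trajectories rather than single poses.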
A nice additional feature is that the robot's microphone is used to monitor the quality of the sound produced. If a note is too quiet, the robot hits harder next time, and so on. It even checks that the frequency is correct, in case it is out of alignment with the instrument and has hit the wrong bar. What this means is that, for the first time, you could program Nao to play a tune just by providing a score and expect it to be played on any instrument of the type, not just the specific example used in the demo. Take a look at the video for the finer points of what is going on:
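The audio feedback described above amounts to a simple correction loop. The sketch below shows the idea, with hypothetical units, gain, and force range - the actual controller, thresholds and audio pipeline in the work are not specified here:

```python
def adjust_strike(force, measured_db, target_db, gain=0.01,
                  f_min=0.2, f_max=1.0):
    """Proportional correction of strike force from measured loudness:
    hit harder when too quiet, softer when too loud.
    Units, gain and force range are illustrative assumptions."""
    force += gain * (target_db - measured_db)
    return max(f_min, min(f_max, force))  # clamp to a safe range

def hit_correct_bar(measured_hz, expected_hz, tol_hz=5.0):
    """A detected fundamental far from the expected bar's pitch suggests
    the hammer is misaligned and struck the wrong bar."""
    return abs(measured_hz - expected_hz) <= tol_hz
```

A failed `hit_correct_bar` check would be the trigger to re-run the visual localization rather than to keep adjusting force.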
It seems that the movements are precomputed before the tune is played, and if you noticed the x4 and x16 speed-ups at various points in the video, this obviously takes some time. You might not want to wait around while Nao works out how best to play a long piece of music. However, this is all a matter of speed, and speed is nearly always a matter of the right hardware. In principle, you could give Nao a score and have it played in a human way. What next? Other instruments, obviously - although for some instruments, a piano performance say, Nao is going to need a few more fingers. If you want to know more, the work, by the University of Freiburg's Humanoid Robots Lab, will be presented at the forthcoming Humanoids 2014 conference.
Last Updated ( Sunday, 02 November 2014 )