Spot With AI - The New Robotics
Written by Harry Fairhead   
Sunday, 29 December 2024

It was just last week that I complained that, despite the advances in AI, robots seem to be just as stupid as ever. Now I have found a video of Spot embracing AI in a way that is a step on the way to the future of robotics and, indeed, AI.

Until recently most robotics companies weren't really AI companies. Robots did what they did because of impressive applications of basic physics, mechanics if you like. Inverse kinematics was used to work out the paths that parts of a robotic limb would have to take to move from one point to another. Complex control theory equations were solved to allow a robot to balance on however many feet it happened to have.

Robotics has previously relied on the application of numerical modeling of physical systems. There were exceptions, of course, but this observation is largely true and as a result robots have mostly not been autonomous. Many were either piloted in real time by a user or ran stored procedures that steered them through a set of behaviors. When you look at the amazing dance routines that robots like Atlas have engaged in, you are mostly watching the choreography of a directing programmer. If anything in the environment changed, the robot would mostly carry on as if nothing had happened. Of course, many robot engineers introduced sensitivity to the local environment - obstacle avoidance, path selection and so on - but nothing too sophisticated.

So where is the AI in robotics?

It seems that it is on its way, but companies with so much invested in classical deterministic behavior - robot engineering - are going to be slow to give up hard-won territory and replace it with opaque AI algorithms. The algorithms are opaque because it is in the nature of neural networks that it is difficult to explain why they do what they do.

It seems that Boston Dynamics is starting to wake up to AI. A team is investigating ways of incorporating deep neural networks into the processing of its sensors. A pretrained model is used to process the data and recognize potential hazards. The result is that we have now moved on from a pure engineering approach of "don't move into a space that is occupied" to one that allows a custom reaction to each object that is identified:

Moving in the right direction, but to be honest it's disappointingly slow and, Boston Dynamics being a for-profit company, there's no suggestion as to which foundation model was used and, of course, no open source access to the code.

This is the least you can expect for AI in robots. An end-to-end approach, with a neural network processing the sensors and driving the actuators, is the goal, but it seems that this is still a way off. What Boston Dynamics seems to be doing is bolting the "new AI" onto the existing robot engineering. That is, they are using the output of the vision model to identify objects and then resorting to a simple algorithm to decide what to do. I guess you could call it AI-augmented robot engineering.
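The general shape of this AI-augmented approach can be sketched in a few lines: a vision model supplies a label, and a classical, hand-written rule table supplies the reaction. The classifier here is a stand-in, not Boston Dynamics' actual model, and the labels and reactions are invented for illustration:

```python
# A minimal sketch of "AI-augmented robot engineering": a hypothetical
# vision model labels what is in front of the robot, and a deterministic
# rule table - the classical engineering half - decides how to react.

def classify_obstacle(sensor_frame):
    """Stand-in for a pretrained vision model.

    A real system would run a deep network on camera data; here we just
    read a label and confidence score from a dict, for illustration only.
    """
    return sensor_frame.get("label", "unknown"), sensor_frame.get("score", 0.0)

# Hand-written reaction table: one custom behavior per recognized object.
REACTIONS = {
    "person":   "stop_and_wait",
    "forklift": "yield_right_of_way",
    "pallet":   "plan_path_around",
    "unknown":  "slow_and_rescan",   # safe default
}

def choose_action(sensor_frame, min_confidence=0.6):
    """Map the model's output to a behavior via the fixed rule table."""
    label, score = classify_obstacle(sensor_frame)
    if score < min_confidence:
        label = "unknown"            # low confidence: fall back to the safe default
    return REACTIONS[label]
```

So `choose_action({"label": "person", "score": 0.9})` returns `"stop_and_wait"`. The neural network only ever names the object; everything the robot actually does is still conventional, inspectable code, which is precisely why this is a bolt-on rather than end-to-end AI.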


More Information

Put It in Context with Visual Foundation Models

Related Articles

Robot Xmas 2024

GR00T Could Be The Robot You Have Always Wanted

ROScribe - LLMs At The Service Of Robotics

30 Years of Boston Dynamics




Last Updated ( Sunday, 29 December 2024 )