There is a problem to be solved before robots can be entrusted to look after children and the elderly - how to establish a bi-directional relationship with them. The new science of "lovotics" sets out to explore this.
Humans have a long-standing relationship with inanimate objects - just think of a cuddly toy. The object doesn't have to be soft and cuddly, but it helps. Now we have a new subject, "Lovotics", proposed by Hooman Samani, an artificial intelligence researcher at the Social Robotics Lab of the National University of Singapore. It aims to engineer the love we feel for robots and, perhaps just as importantly, to find ways for robots to express love back.
The key here is that the robot is intended to take an active role in promoting the love - as a bidirectional interaction.
"Even though various fields have proposed ideas about the role and function of love, the current understanding about love is still quite limited. Furthermore, developing an affection system similar to that of the human being presents considerable technological challenges."
The idea is to formalize the complex system that controls how humans feel towards one another - their emotions, reasoning and even their endocrine system. From this you can attempt to build robots that humans can love, and that can love them back.
This is not a simple interaction, and so far the robots seem to have displayed jealousy and a constant demand to be stroked by their human keepers. They run around and tweet and twitter like birds - it's all a bit like a determined attempt to be cute.
Before the robot was built, a survey revealed that 19% of respondents thought they could love a robot and, surprisingly, 36% thought they could be loved by a robot. This suggests that unrequited love is going to be a real problem. More seriously, it probably reveals an over-optimism about what robots are capable of, based on an ignorance of the technology.
There are serious issues here, but they are mostly centered on the problem of getting a humanoid robot accurate enough to avoid the "uncanny valley" effect - where small errors of behaviour or appearance become creepy rather than endearing. In this case we have small "tribble-like" robots which mimic the role of a pet rather than another human. There is also the argument that humans are suckers enough for "cute" without the device exhibiting reactive behaviours.