Robots Will Understand Your Emotions Better

Robots could be taught to recognize human emotions from our movements, a new study shows. Researchers found that humans could recognize excitement, sadness, aggression, and boredom from the way people moved, even when they could not see facial expressions or hear voices.
The study was conducted by researchers from Warwick Business School, University of Plymouth, Donders Centre for Cognition at Radboud University in the Netherlands, and the Bristol Robotics Lab.
According to the study, published in the journal Frontiers in Robotics and AI, robots could be taught to recognize human emotions from our movements. A team of psychologists and computer scientists filmed pairs of children playing with a robot and a computer built into a table with a touchscreen top. Some participants watched the original videos.
A second group saw the footage reduced to “stickman” figures that showed exactly the same movements. Members of both groups agreed on the same emotional labels for the children more often than would be expected by chance.
The researchers then trained a machine-learning algorithm to label the clips, identifying the type of social interaction, the emotions on display, and the strength of each child’s internal state, allowing it to compare which child felt sadder or more excited.
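The article does not specify which algorithm the researchers used. As a minimal illustrative sketch, the setup can be imagined as a supervised classifier trained on per-clip movement features; everything below (feature vectors, emotion labels, the choice of a random forest, the synthetic data) is an assumption for illustration, not the study's method.

```python
# Hypothetical sketch: labeling clips with emotions from movement features.
# Assumes each clip has been summarized as a fixed-length vector of pose
# statistics (e.g., mean joint velocities); the features, labels, and
# classifier choice are illustrative, not taken from the study.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
EMOTIONS = ["excited", "sad", "aggressive", "bored"]

# Synthetic stand-in data: 50 clips per emotion, 10 movement features each,
# with class-dependent means so there is signal to learn.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(50, 10)) for i in range(4)])
y = np.repeat(EMOTIONS, 50)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Held-out accuracy gives a rough sense of how well movement alone
# separates the labels in this toy setting.
accuracy = clf.score(X_test, y_test)
```

Comparing the strength of internal states, as the study describes, would need graded labels (a regression or ordinal target) rather than the categorical labels used here.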
Dr. Charlotte Edmunds, of Warwick Business School, said: “One of the main goals in the field of human-robot interaction is to create machines that can recognize human emotions and respond accordingly.
“Our results suggest it is reasonable to expect a machine learning algorithm, and consequently a robot, to recognize a range of emotions and social interactions using movements, poses, and facial expressions. The potential applications are huge.”
Madeleine Bartlett, of the University of Plymouth, said: “It is important that a social robot can recognize human internal states, such as stress, and their strength or severity.
“Different levels of stress require different responses. Low-level stress might just require the robot to back away from the human, while a high level of stress might be best addressed by having the robot apologize and leave the interaction.”
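The graded responses Bartlett describes amount to a simple policy mapping an estimated stress level to an action. A minimal sketch, assuming a stress estimate normalized to [0, 1] and hypothetical thresholds:

```python
# Illustrative policy, not from the study: map an estimated stress level
# (0.0 = calm, 1.0 = highly stressed) to a robot response, following the
# researcher's examples. The thresholds are arbitrary assumptions.
def respond_to_stress(stress_level: float) -> str:
    if stress_level < 0.3:
        # Negligible stress: no change needed (hypothetical extra case).
        return "continue interaction"
    elif stress_level < 0.7:
        # Low-level stress: the robot backs away from the human.
        return "back away"
    else:
        # High stress: the robot apologizes and leaves the interaction.
        return "apologize and leave"
```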
Their findings suggest robots could learn to use the same movements, alongside facial expressions and tone of voice, to recognize human internal states. It raises the prospect that robots, already used to teach second languages, could recognize when students are bored, and customer service robots could identify when people feel angry or stressed.
Beyond these applications, the study offers a way to connect humans and machines: it could help scientists build robots that react to human emotions in difficult situations and withdraw from an interaction on their own, without having to be monitored or told what to do.
