AI robot predicts a smile and smiles at the same time as the other person

The Emo robot can predict a smile about 840 milliseconds before the other person smiles, then smile in sync with them.


The Emo robot can predict expressions and smile at the same time as the other person. (Video: New Scientist)

Humans are gradually becoming accustomed to robots capable of fluent verbal communication, thanks in part to advances in large language models such as ChatGPT, but their nonverbal communication skills, particularly facial expressions, still lag far behind. Designing a robot that not only displays a wide range of facial expressions but also knows when to use them is extremely difficult.

The Creative Machines Lab at Columbia University's School of Engineering in the US has been working on this problem for more than five years. In a new study published in the journal Science Robotics, the team introduced Emo, an AI robot that can predict human facial expressions and perform them at the same time as the person it is facing, TechXplore reported on March 27. Emo anticipates a smile about 840 milliseconds before the other person smiles, then smiles simultaneously.

Emo is a robotic head whose face is equipped with 26 actuators that allow a wide range of expressions. The head is covered with soft silicone skin attached by a magnetic linkage system, making it easy to adjust and quick to maintain. For more lifelike interactions, the team integrated high-resolution cameras into the pupil of each eye, allowing Emo to make eye contact, which is important in nonverbal communication.

The research team developed two AI models. The first predicts human facial expressions by analyzing subtle changes in the other person's face; the second generates the motor commands that produce the corresponding expression on the robot.
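
The study does not publish its code, but a minimal sketch of how such a two-model pipeline might be wired together is shown below. This is hypothetical PyTorch code: the landmark vector size, window length, and layer choices are illustrative assumptions; only the count of 26 actuators comes from the article.

```python
import torch
import torch.nn as nn

NUM_LANDMARKS = 113   # assumed size of a facial-landmark vector per camera frame
NUM_ACTUATORS = 26    # from the article: Emo's face has 26 actuators

class ExpressionPredictor(nn.Module):
    """Model 1 (sketch): watches a short window of face frames and
    predicts the expression the person is about to make."""
    def __init__(self, hidden=128):
        super().__init__()
        self.rnn = nn.GRU(NUM_LANDMARKS, hidden, batch_first=True)
        self.head = nn.Linear(hidden, NUM_LANDMARKS)

    def forward(self, frames):          # frames: (batch, window, NUM_LANDMARKS)
        _, h = self.rnn(frames)
        return self.head(h[-1])         # predicted upcoming expression

class InverseModel(nn.Module):
    """Model 2 (sketch): maps a target expression to the motor commands
    that should reproduce it on the robot's face."""
    def __init__(self, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_LANDMARKS, hidden), nn.ReLU(),
            nn.Linear(hidden, NUM_ACTUATORS), nn.Sigmoid(),  # actuator positions scaled to [0, 1]
        )

    def forward(self, expression):
        return self.net(expression)

# At run time the two models are chained: observe -> predict -> actuate,
# so the motors can start moving before the human expression fully forms.
predictor, inverse = ExpressionPredictor(), InverseModel()
recent_frames = torch.randn(1, 10, NUM_LANDMARKS)    # stand-in for live camera input
motor_commands = inverse(predictor(recent_frames))   # shape: (1, 26)
```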

To teach the robot how to make expressions, the team placed Emo in front of a camera and let it make random movements. After a few hours, the robot had learned the relationship between facial expressions and motor commands, much as humans practice expressions in front of a mirror. The team calls this "self-modelling", akin to the way people can imagine what they would look like when making a certain expression.
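
This self-modelling phase can be pictured as simple motor babbling, as in the sketch below. It continues the hypothetical code above; the `robot` and `camera` objects are placeholder interfaces, not Emo's real API.

```python
import torch

NUM_ACTUATORS = 26  # from the article

def babble(robot, camera, steps=5000):
    """Self-modelling sketch: issue random motor commands, observe the
    facial expression each one produces, and keep the pairs as training
    data so the inverse model can be fit in the expression -> command
    direction it is used in at run time."""
    commands, expressions = [], []
    for _ in range(steps):
        cmd = torch.rand(NUM_ACTUATORS)    # random pose for the 26 facial actuators
        robot.set_actuators(cmd)           # hypothetical call: move the face
        expr = camera.read_landmarks()     # hypothetical call: landmark tensor of the result
        commands.append(cmd)
        expressions.append(expr)
    return torch.stack(expressions), torch.stack(commands)
```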

Robots can now integrate facial expressions to respond.

Next, the research team played videos of human facial expressions for Emo to observe frame by frame. After hours of training, Emo could predict expressions by spotting the subtle facial changes that appear as a person begins to smile.
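
Training the predictor on those videos might look roughly like the sketch below, which reuses the ExpressionPredictor from the earlier sketch. The frame rate, window length, and optimizer settings are assumptions; only the roughly 840-millisecond lead time is taken from the article (about 25 frames at 30 fps).

```python
import torch
import torch.nn as nn

def train_predictor(predictor, clips, window=10, lead_frames=25, epochs=5, lr=1e-3):
    """Sketch: `clips` is a tensor of landmark sequences with shape
    (num_clips, frames, landmark_dim). The model sees a short early
    window of each clip and is trained to output the expression about
    0.8 s later, so it learns to anticipate rather than merely copy."""
    opt = torch.optim.Adam(predictor.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for clip in clips:
            early = clip[:window].unsqueeze(0)                 # what the robot sees first
            future = clip[window + lead_frames].unsqueeze(0)   # the expression to anticipate
            opt.zero_grad()
            loss = loss_fn(predictor(early), future)
            loss.backward()
            opt.step()
```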

"I think accurately predicting human facial expressions is a revolution in the field of human-robot interaction. Previously, robots were not designed to consider human expressions during interaction. Now Here, the robot can integrate facial expressions to respond ," said Yuhang Hu, a doctoral student at the Innovative Machinery Laboratory, a member of the research team.

"The fact that robots make expressions at the same time as humans in real time not only helps improve the quality of interaction but also helps build trust between humans and robots. In the future, when interacting with robots, it will will observe and interpret your facial expressions, just like a real person ," Hu added.