This new AI can guess people's feelings from their gait

Are you happy, sad, or angry? Scientists are teaching algorithms to recognize emotions just by looking at your gait!

Our emotions can affect everything from our appetite to the way we perceive the world - and even the way we walk.

So can we interpret someone's feelings based on their gait alone? That's exactly what scientists at the University of North Carolina at Chapel Hill and the University of Maryland at College Park have been teaching computers to do. Using deep learning, their software can analyze a video of someone walking, transform it into a 3D model, and extract that person's gait. A neural network then determines the dominant movement and pairs it with a specific emotion, based on the data it was previously trained on. According to the researchers, this deep learning model can predict 4 different emotions - happy, sad, angry and normal - with an accuracy of up to 80%.
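To make the pipeline concrete, here is a minimal sketch of the idea: reduce a sequence of 3D poses to a few gait features, then map those features to one of the four emotions. This is not the researchers' actual model (they use a trained neural network); the feature names, centroid values, and nearest-centroid rule below are all invented for illustration.

```python
# Illustrative sketch (not the researchers' actual model): label a walk
# with one of four emotions from two hand-crafted gait features.
# All feature values and centroids here are made up for demonstration.

import math

EMOTIONS = ["happy", "sad", "angry", "normal"]

# Hypothetical per-emotion centroids in (stride speed, head tilt) space.
CENTROIDS = {
    "happy":  (1.4, 0.10),   # brisk pace, upright head
    "sad":    (0.7, -0.30),  # slow pace, head down
    "angry":  (1.6, -0.05),  # fast pace, slight forward lean
    "normal": (1.1, 0.00),
}

def extract_features(poses):
    """Reduce a sequence of 3D poses to (mean speed, mean head tilt).

    `poses` is a list of per-frame dicts with 'hip' = (x, y, z) and
    'head_tilt' in radians (positive = upright). Both keys are
    hypothetical names for this sketch.
    """
    speeds = []
    for a, b in zip(poses, poses[1:]):
        speeds.append(math.dist(a["hip"], b["hip"]))
    tilt = sum(p["head_tilt"] for p in poses) / len(poses)
    return (sum(speeds) / len(speeds), tilt)

def classify(poses):
    """Return the emotion whose centroid is nearest to the gait features."""
    f = extract_features(poses)
    return min(EMOTIONS, key=lambda e: math.dist(f, CENTROIDS[e]))
```

In this toy setup, a slow walk with the head lowered lands nearest the "sad" centroid, while a brisk, upright walk lands nearest "happy"; the real system learns such associations from labeled training data rather than hand-set thresholds.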

This deep learning model can predict 4 different emotions - happy, sad, angry and normal.

Although we have seen many AIs trained to predict human emotions from facial expressions or voices, this is the first time an algorithm has been trained to predict emotions accurately just by watching people walk.

Aniket Bera, research supervisor and professor of computer science at the University of Maryland, said their research is not about detecting a person's actual emotions, but about predicting perceived emotion: the feeling a person appears to convey to an observer, much as we intuit each other's feelings when we meet in everyday life. According to Bera, the research could teach robots to guess what the people around them are feeling and adjust their behavior accordingly. Or vice versa: it could help engineers design robots that communicate better through their gaits and body movements.

Bera added that this study could help pave the way for future surveillance applications, or even make mixed-reality experiences more engaging, since 3D models of how specific people walk can help designers create more realistic characters.

The team concluded that the next step is to focus on actions beyond walking, such as running and gesturing, to understand the subtler emotions we express when we move.