Teaching robots to make human facial expressions

Researchers at Columbia Engineering have used Artificial Intelligence to teach robots how to make appropriate reactive human facial expressions

Facial expressions play a key role in building trust, but most robots have yet to master ours. As robots are increasingly deployed in settings where they must work harmoniously alongside humans, from nursing homes to warehouses and factories, the need for responsive, facially realistic robots is growing: a robot that can imitate human facial expressions could build trust between humans and their robotic co-workers and care-givers.

For the last five years, scientists in the Creative Machines Lab at Columbia Engineering have been developing EVA, a new autonomous robot with a soft and expressive face that reacts to match the expressions of nearby humans.
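The article describes EVA reacting to match the expressions of nearby humans, which implies a perceive-and-imitate loop: estimate the human's expression from a camera frame, then drive the robot's face toward the same expression. The sketch below illustrates that loop in miniature; every name in it is invented for illustration, and the classifier is a stub standing in for the trained neural network a real system would use.

```python
def classify_expression(frame):
    """Placeholder expression classifier.

    A real system would run a trained neural network on the camera
    frame; here we return a fixed label so the sketch is runnable.
    """
    return "joy"

class FakeFace:
    """Stand-in for the robot's motor controller."""
    def __init__(self):
        self.current = "neutral"

    def set_expression(self, emotion):
        # A real controller would translate the emotion label into
        # motor commands; here we just record the target expression.
        self.current = emotion

def mirror_step(frame, face):
    """One iteration of the perceive -> imitate loop."""
    emotion = classify_expression(frame)
    face.set_expression(emotion)
    return emotion

face = FakeFace()
result = mirror_step(frame=None, face=face)
```

Running the loop continuously on live camera frames would yield the reactive, mirroring behavior the article describes.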

“The idea for EVA took shape a few years ago when my students and I began to notice that the robots in our lab were staring back at us through plastic, googly eyes,” explained Hod Lipson, James and Sally Scapa Professor of Innovation (Mechanical Engineering) and director of the Creative Machines Lab.

Lipson noted a comparable trend in the grocery store, where he encountered restocking robots wearing name badges, and in one case, dressed in a hand-knit cap. “People seemed to be humanizing their robotic colleagues by giving them eyes, an identity, or a name,” he remarked. “This made us wonder, if eyes and clothing work, why not make a robot that has a super-expressive and responsive human face?”

While this sounds straightforward, producing a convincing robotic face has been a formidable challenge for roboticists. For years, robotic body parts have been made of metal or hard plastic, materials too stiff to flow and move the way human tissue does. Robotic hardware has been similarly crude and difficult to work with: circuits, sensors, and motors are heavy, power-intensive, and bulky.

EVA can express the six basic emotions of anger, disgust, fear, joy, sadness, and surprise, as well as an array of more nuanced emotions, by using artificial ‘muscles’ – cables and motors – that pull on specific points on EVA’s face, mimicking the movements of the more than 42 tiny muscles attached at various points to the skin and bones of human faces.
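One common way to control a cable-driven face like the one described above is to store a preset pattern of cable pulls for each basic emotion and blend presets together to get nuanced expressions. The sketch below shows that idea; the emotion presets, cable names, and activation values are all invented for illustration and are not EVA's real control parameters.

```python
# Each "muscle" is a motor-driven cable; an activation of 0.0 means
# slack and 1.0 means fully pulled. These presets are hypothetical.
EXPRESSION_PRESETS = {
    "joy":      {"mouth_corner_l": 0.9, "mouth_corner_r": 0.9, "brow_raise": 0.3},
    "sadness":  {"mouth_corner_l": 0.1, "mouth_corner_r": 0.1, "brow_inner": 0.8},
    "surprise": {"brow_raise": 1.0, "jaw_open": 0.7},
}

def blend_expression(weights):
    """Blend several emotion presets into one set of cable activations.

    weights: dict mapping emotion name -> blend weight in [0, 1].
    Returns a dict of cable name -> activation, clamped to [0, 1].
    """
    activations = {}
    for emotion, w in weights.items():
        for cable, level in EXPRESSION_PRESETS[emotion].items():
            activations[cable] = activations.get(cable, 0.0) + w * level
    # Clamp so no cable is commanded past its full pull.
    return {c: min(1.0, max(0.0, a)) for c, a in activations.items()}

# A face that is mostly joyful with a hint of surprise:
mixed = blend_expression({"joy": 0.8, "surprise": 0.3})
```

Blending weighted presets this way is analogous to how human expressions arise from graded, overlapping muscle activations rather than from a few fixed poses.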

“The greatest challenge in creating EVA was designing a system that was compact enough to fit inside the confines of a human skull while still being functional enough to produce a wide range of facial expressions,” commented Zanwar Faraj, an undergraduate student on the team.

To surmount this difficulty, the team relied on 3D printing to manufacture parts with intricate shapes that integrated seamlessly and efficiently with EVA’s skull. After weeks of tugging cables to make EVA smile, frown, or look upset, the team noticed that EVA’s face could provoke emotional responses from their lab mates. “I was minding my own business one day when EVA suddenly gave me a big, friendly smile,” Lipson noted. “I knew it was purely mechanical, but I found myself reflexively smiling back.”

The team stressed that EVA is a laboratory experiment, and imitation alone is far from the intricate ways that humans communicate via facial expressions. However, such technologies could one day have valuable, real-world functions. For example, robots with the ability to respond to a wide range of human body language would be beneficial in workplaces, hospitals, schools, and homes. “There is a limit to how much we humans can engage emotionally with cloud-based chatbots or disembodied smart-home speakers,” explained Lipson. “Our brains seem to respond well to robots that have some kind of recognizable physical presence.”

“Robots are intertwined in our lives in a growing number of ways, so building trust between humans and machines is increasingly important,” added Boyuan Chen, a PhD student in the lab.

Source: https://www.innovationnewsnetwork.com/
