
This Robot Learns From Our Facial Expressions

Nowadays robots are capable of doing just about anything we tell them to, but what about the things we don't explicitly say or command? Thanks to a wireless headband designed by Anna Gruebler and her team at the University of Tsukuba, we may be one step closer to teaching robots how to respond to our emotions.

Developed at the AI Lab of the University of Tsukuba, the wireless headband works by capturing electromyographic (EMG) signals from the sides of the face, reliably detecting when the wearer smiles or frowns. Unlike traditional expression-capture devices such as cameras running smile-detection algorithms, the headband works in low light and during movement, and of course isn't restricted by a camera's field of view.
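The article doesn't describe the team's actual signal-processing pipeline, but the general idea of mapping EMG channels to expressions can be sketched with a toy example: compute a simple amplitude feature (RMS) per channel, then label a new reading by its nearest calibration centroid. The channel layout, feature choice, and classifier here are all assumptions for illustration, not the researchers' method.

```python
import math

def rms(window):
    """Root-mean-square amplitude of one EMG signal window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def train_centroids(labelled):
    """Average the feature vectors for each expression label.

    labelled: list of (label, feature_tuple) calibration samples,
    e.g. ("smile", (left_rms, right_rms)).
    """
    sums, counts = {}, {}
    for label, feat in labelled:
        s = sums.setdefault(label, [0.0] * len(feat))
        for i, v in enumerate(feat):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: tuple(v / counts[lab] for v in s) for lab, s in sums.items()}

def classify(feat, centroids):
    """Assign the label whose centroid is closest to this feature vector."""
    return min(centroids, key=lambda lab: math.dist(feat, centroids[lab]))
```

A real system would add filtering, windowing, and a stronger classifier, but the nearest-centroid step shows why wearable EMG sidesteps lighting and occlusion problems: the input is muscle activity, not pixels.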

The current version of the headband detects smiles and frowns with a success rate of over 97 percent, and it was used to train a Nao humanoid robot in real time. As seen in the video, the trainer teaches the robot to either discard a ball or hand it to her. The robot is hesitant at first, but responds more quickly after gaining experience and receiving correct feedback from its teacher.
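The training loop described above, where a smile or frown acts as a reward signal, resembles simple reinforcement learning. As a hedged sketch (the action names, reward values, and update rule are assumptions, not the researchers' algorithm), the robot could keep a value estimate per action and nudge it toward +1 on a smile and -1 on a frown:

```python
import random

class ExpressionTrainedAgent:
    """Toy value-learning agent: the teacher's facial expression is the
    reward signal (smile = +1, frown = -1, anything else = 0)."""

    def __init__(self, actions, epsilon=0.1, alpha=0.5):
        self.values = {a: 0.0 for a in actions}  # estimated value per action
        self.epsilon = epsilon                   # exploration rate
        self.alpha = alpha                       # learning rate

    def choose(self):
        """Mostly pick the best-valued action; occasionally explore."""
        if random.random() < self.epsilon:
            return random.choice(list(self.values))
        return max(self.values, key=self.values.get)

    def feedback(self, action, expression):
        """Move the action's value estimate toward the observed reward."""
        reward = {"smile": 1.0, "frown": -1.0}.get(expression, 0.0)
        self.values[action] += self.alpha * (reward - self.values[action])
```

Early on the value estimates are near zero, so the agent's choices look hesitant; after a few smiles for the right action and frowns for the wrong one, the estimates separate and it commits quickly, matching the behavior described in the video.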

We're still far from teaching robots to respond to the full range of human facial expressions and emotions, but devices like this wireless headband could prove to be a useful and innovative way to interact with robots. Of course, we also have to be careful not to frown too much, otherwise these robots might just get the wrong idea about the human race.