Humans convey information about their emotional state through facial expressions. Most robots cannot produce human-like facial expressions, which makes it hard for them to imitate emotions. Here we investigate how LED patterns around the eyes of Aldebaran’s Nao robot can be used to imitate human emotions. We performed two experiments. In the first experiment we examined which LED color, intensity, frequency, sharpness, and orientation humans associate with different emotions. Based on the results, 12 LED patterns were created. The second experiment measured how well humans recognized those LED patterns as the emotions intended by their design. We used an ROC (Receiver Operating Characteristic) graph to determine which of the 12 LED patterns were best suited for the Nao robot to imitate emotions. Our technique of using ROC graphs is generally applicable to evaluating other methods for imitating human emotions (e.g., gestures, speech) as well.
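To illustrate the ROC-based evaluation described above, the following sketch computes an ROC point (false positive rate, true positive rate) for a single LED pattern from hypothetical recognition counts. The function name and the trial counts are illustrative assumptions, not data from the experiments.

```python
def roc_point(hits, misses, false_alarms, correct_rejections):
    """Return (false positive rate, true positive rate) for one pattern.

    hits / misses: trials where the pattern's intended emotion was shown
    and was / was not recognized as that emotion.
    false_alarms / correct_rejections: trials where a different emotion
    was shown but this emotion was / was not reported.
    """
    tpr = hits / (hits + misses)                        # true positive rate
    fpr = false_alarms / (false_alarms + correct_rejections)  # false positive rate
    return fpr, tpr

# Hypothetical counts: 20 trials with the intended emotion, 40 without.
fpr, tpr = roc_point(hits=16, misses=4, false_alarms=6, correct_rejections=34)
# Patterns plotting closer to the top-left corner (fpr=0, tpr=1) are the
# better ones for the robot to imitate the emotion with.
```

Plotting one such point per pattern yields the ROC graph used to rank the 12 designs.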