Imitating human emotions with artificial facial expressions

D.O. Johnson, R.H. Cuijpers, D. van der Pol

Research output: Contribution to journal › Article › Academic › peer-review


Abstract

Humans convey information about their emotional state through facial expressions. Robots typically cannot show facial expressions like humans do, making it hard for them to imitate emotions. Here we investigate how LED patterns around the eyes of Aldebaran’s Nao robot can be used to imitate human emotions. We performed two experiments. In the first experiment we examined the LED color, intensity, frequency, sharpness, and orientation that humans associate with different emotions. Based on the results, 12 LED patterns were created. The second experiment measured how well humans recognized those LED patterns as the emotions intended by the design. We used a ROC (Receiver Operating Characteristic) graph to determine which of the 12 LED patterns were the best ones for the Nao robot to imitate emotions with. Our technique of using ROC graphs is generally applicable to determining the best of other methods for imitating human emotions (e.g., gestures, speech), as well.
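The ROC-based selection the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's actual data or code: the pattern names, participant counts, and the nearest-to-corner selection rule are all hypothetical assumptions. For each LED pattern, recognition responses are reduced to a hit rate and a false-alarm rate, giving one point in ROC space; patterns closest to the ideal corner (0, 1) are the best candidates for imitating emotions.

```python
# Hedged sketch of ROC-point selection for emotion patterns.
# All counts below are invented for illustration, not the paper's data.

def roc_point(hits, misses, false_alarms, correct_rejections):
    """Return (false-alarm rate, hit rate) for one pattern."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return fa_rate, hit_rate

def distance_to_ideal(fa_rate, hit_rate):
    """Euclidean distance to the perfect-recognition corner (0, 1)."""
    return ((fa_rate - 0.0) ** 2 + (hit_rate - 1.0) ** 2) ** 0.5

# Hypothetical response counts per pattern:
# (hits, misses, false alarms, correct rejections)
patterns = {
    "happy": (18, 2, 3, 17),
    "sad":   (12, 8, 6, 14),
    "angry": (15, 5, 10, 10),
}

points = {name: roc_point(*counts) for name, counts in patterns.items()}
best = min(points, key=lambda name: distance_to_ideal(*points[name]))
print(points)
print("best pattern:", best)  # here "happy", the point nearest (0, 1)
```

With these invented counts, "happy" lies closest to the ideal corner, so it would be the pattern judged best recognized; the paper applies the same idea to its 12 designed LED patterns.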
Original language: English
Pages (from-to): 503-513
Number of pages: 11
Journal: International Journal of Social Robotics
Volume: 5
Issue number: 4
Publication status: Published - 2013

