Communicating emotions and mental states to robots in a real time parallel framework using Laban movement analysis

    Research output: Contribution to journal › Article › Academic › peer-review


    Abstract

    This paper presents a parallel real-time framework for the extraction and recognition of emotions and mental states from video fragments of human movements. In the experimental setup, human hands are tracked by evaluating moving skin-colored objects. The tracking analysis demonstrates that the acceleration and frequency characteristics of the traced objects are relevant for classifying the emotional expressiveness of human movements. The outcomes of the emotion and mental-state recognition are cross-validated with the analysis of two independent certified movement analysts (CMAs) who use the Laban movement analysis (LMA) method. We argue that LMA-based computer analysis can serve as a common language for expressing and interpreting emotional movements between robots and humans, and in that way it resembles the common coding principle between action and perception in humans and primates that is embodied by the mirror neuron system. The solution is part of a larger project on interaction between a human and a humanoid robot, with the aim of training social behavioral skills in autistic children with robots acting in a natural environment.
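The abstract's core signal-processing idea, that acceleration and frequency characteristics of tracked hand trajectories carry information about expressiveness, can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the function name, feature set, and sampling rate are assumptions, and a real pipeline would obtain the trajectory from a skin-color tracker rather than from synthetic data.

```python
import numpy as np

def movement_features(xy, fps=25.0):
    """Illustrative acceleration/frequency features from a 2-D hand
    trajectory: an (N, 2) array of positions sampled at `fps` frames/s."""
    xy = np.asarray(xy, dtype=float)
    dt = 1.0 / fps
    vel = np.gradient(xy, dt, axis=0)      # per-axis velocity
    acc = np.gradient(vel, dt, axis=0)     # per-axis acceleration
    speed = np.linalg.norm(vel, axis=1)
    acc_mag = np.linalg.norm(acc, axis=1)
    # Dominant oscillation frequency of the (mean-centered) position signal,
    # summing spectral energy across both axes
    centered = xy - xy.mean(axis=0)
    spectrum = np.abs(np.fft.rfft(centered, axis=0)).sum(axis=1)
    freqs = np.fft.rfftfreq(len(xy), d=dt)
    return {
        "mean_speed": speed.mean(),
        "mean_acceleration": acc_mag.mean(),
        "peak_acceleration": acc_mag.max(),
        "dominant_frequency_hz": freqs[np.argmax(spectrum)],
    }

# Hypothetical example: a hand waving horizontally at 2 Hz for 2 s at 25 fps
t = np.arange(0, 2, 1 / 25.0)
trajectory = np.stack([100 + 50 * np.sin(2 * np.pi * 2 * t),
                       np.full_like(t, 200)], axis=1)
features = movement_features(trajectory)
```

Features like these could then feed a classifier of emotional expressiveness, with the CMAs' LMA annotations serving as ground truth.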
    Original language: English
    Pages (from-to): 1256-1265
    Journal: Robotics and Autonomous Systems
    Volume: 58
    Issue number: 12
    DOIs
    Publication status: Published - 2010

