Abstract
In an ambience designed to adapt to the user’s affective state, pervasive
technology should be able to unobtrusively decipher their underlying
mood. Great effort has been devoted to automatic punctual emotion recognition
from visual input. Conversely, little has been done to recognize longer-lasting
affective states, such as mood. Taking for granted the effectiveness of emotion
recognition algorithms, we go one step further and propose a model for estimating
the mood of an affective episode from a known sequence of punctual emotions.
To validate our model experimentally, we rely on the human annotations
of the well-established HUMAINE database. Our analysis indicates that we can
fairly accurately approximate the human process of summarizing the emotional
content of a video into a mood estimate. A moving average function with exponential
discount of the past emotions achieves mood prediction accuracy above
60%.
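The exponentially discounted moving average mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the valence-score input representation, the discount factor `gamma`, and the normalization are all illustrative assumptions.

```python
def estimate_mood(emotions, gamma=0.9):
    """Estimate an episode's mood from a sequence of punctual emotion
    scores (here assumed to be valence values in [-1, 1]).

    The most recent emotion gets weight 1; an emotion k steps in the
    past is discounted by gamma**k. The discount factor and the
    valence representation are illustrative assumptions, not details
    taken from the paper.
    """
    # Oldest emotion receives the smallest weight.
    weights = [gamma ** k for k in range(len(emotions))][::-1]
    total = sum(w * e for w, e in zip(weights, emotions))
    return total / sum(weights)

# A mildly negative episode that ends on a strongly positive note:
# recency weighting pulls the estimated mood toward positive.
mood = estimate_mood([-0.4, -0.2, 0.1, 0.5], gamma=0.5)
```

With `gamma` close to 1 the estimate approaches a plain average over the whole episode; smaller values make the mood track only the most recent emotions.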
Original language | English |
---|---|
Title of host publication | Proceedings of the 22nd International Conference on User Modeling, Adaptation and Personalization (UMAP), 7-11 July 2014, Aalborg, Denmark |
Place of Publication | Berlin |
Publisher | Springer |
Pages | 122-133 |
ISBN (Print) | 978-3-319-08785-6 |
DOIs | |
Publication status | Published - 2014 |
Event | 22nd International Conference on User Modeling, Adaptation and Personalization (UMAP 2014) - Aalborg, Denmark Duration: 1 Jan 2014 → … Conference number: 22 |
Conference
Conference | 22nd International Conference on User Modeling, Adaptation and Personalization (UMAP 2014) |
---|---|
Abbreviated title | UMAP 2014 |
Country/Territory | Denmark |
City | Aalborg |
Period | 1/01/14 → … |