Beyond cognition and affect: sensing the unconscious

L. Ivonin, H. Chang, M. Diaz, A. Català, W. Chen, G.W.M. Rauterberg

    Research output: Contribution to journal › Article › Academic › peer review

    6 Citations (Scopus)
    429 Downloads (Pure)


    In the past decade, research on human–computer interaction has embraced psychophysiological user interfaces that make computers aware of users' conscious cognitive and affective states and thereby increase the computers' adaptive capabilities. Human experience, however, is not limited to the levels of cognition and affect; it extends into the realm of universal instincts and innate behaviours that form the collective unconscious. Patterns of instinctual traits shape archetypes, which represent images of the unconscious. This study investigated whether seven different archetypal experiences of users lead to recognisable patterns of physiological responses. More specifically, we evaluated the potential of a computer to predict archetypal experiences from physiological data collected with wearable sensors. The subjects were stimulated to feel the archetypal experiences and conscious emotions by means of film clips. The physiological data included measurements of cardiovascular and electrodermal activity. Statistical analysis indicated a significant relationship between the archetypes portrayed in the videos and the physiological responses. Data mining methods enabled us to build between-subject prediction models that classified four archetypes with an accuracy of up to 57.1%. Further analysis suggested that, with within-subject models, classification performance could be improved to 70.3% in the case of seven archetypes.
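
    The distinction between the two evaluation schemes mentioned above can be sketched in code. The snippet below is an illustrative sketch only: the subject IDs, toy feature vectors, and archetype labels are invented, and the paper's actual features and classifiers are not reproduced here. A between-subject model is evaluated by holding out each subject's data entirely (leave-one-subject-out), whereas a within-subject model is trained and tested on data from one and the same subject.

    ```python
    # Toy dataset of (subject_id, feature_vector, archetype_label) tuples.
    # All values are illustrative placeholders, not data from the study.
    samples = [
        ("s1", [0.2, 0.8], "hero"),
        ("s1", [0.3, 0.7], "shadow"),
        ("s2", [0.6, 0.4], "hero"),
        ("s2", [0.5, 0.5], "shadow"),
        ("s3", [0.9, 0.1], "hero"),
        ("s3", [0.8, 0.2], "shadow"),
    ]

    def leave_one_subject_out(samples):
        """Between-subject splits: each fold holds out one subject entirely,
        so the model is always tested on a person it has never seen."""
        subjects = sorted({sid for sid, _, _ in samples})
        for held_out in subjects:
            train = [s for s in samples if s[0] != held_out]
            test = [s for s in samples if s[0] == held_out]
            yield held_out, train, test

    def per_subject(samples):
        """Within-subject grouping: one dataset (and hence one model)
        per subject, trained and tested on that subject's own data."""
        groups = {}
        for sid, features, label in samples:
            groups.setdefault(sid, []).append((features, label))
        return groups
    ```

    With the toy data above, `leave_one_subject_out` yields three folds (one per held-out subject), while `per_subject` returns three separate datasets; any classifier could then be fitted on each split. Within-subject models typically score higher, as in the accuracies reported above, because physiological responses vary strongly between individuals.
    
    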
    Original language: English
    Pages (from-to): 220-238
    Number of pages: 19
    Journal: Behaviour & Information Technology
    Issue number: 3
    Publication status: Published - 2015


