Enriching task models with usability and user experience evaluation data

Regina Bernhaupt, Philippe Palanque, Dimitri Drouet, Celia Martinie

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

4 Citations (Scopus)
3 Downloads (Pure)


Evaluation results focusing on usability and user experience are often difficult to take into account during an iterative design process. This is because evaluation exploits concrete artefacts (a prototype or system), while design and development are based on more abstract descriptions such as task models or software models. As the concrete data cannot be represented, evaluation results are simply discarded. This paper addresses the discrepancy between the abstract view of task models and the concrete data produced in evaluations: first, by describing the requirements for a task modelling notation, namely (a) representation of data for each individual participant, (b) representation of aggregated data for one evaluation as well as (c) for several evaluations, and (d) the need to visualize multi-dimensional data from the evaluation as well as data from the interactive system gathered at runtime; second, by showing how these requirements were integrated into a task modelling tool. Using an example from an experimental evaluation, possible usages of the tool are demonstrated.
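The paper's own notation and tool are not reproduced in this record, but the first three requirements listed in the abstract (per-participant data attached to task model elements, plus aggregation over one or more evaluations) can be sketched roughly as follows. All names and the structure here are illustrative assumptions, not the authors' notation:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Measurement:
    """One observed value for one participant in one evaluation."""
    participant: str
    evaluation: str   # e.g. "study-1", "study-2" (requirement c: several evaluations)
    metric: str       # e.g. "task_time_s", "error_count"
    value: float

@dataclass
class Task:
    """A task model node enriched with evaluation data (requirement a)."""
    name: str
    subtasks: list = field(default_factory=list)
    measurements: list = field(default_factory=list)

    def aggregate(self, metric, evaluation=None):
        """Aggregated view of a metric, for one evaluation or across
        all of them (requirements b and c)."""
        values = [m.value for m in self.measurements
                  if m.metric == metric
                  and (evaluation is None or m.evaluation == evaluation)]
        return mean(values) if values else None

# Usage: attach concrete evaluation data to an abstract task.
login = Task("Enter credentials")
login.measurements += [
    Measurement("P1", "study-1", "task_time_s", 12.4),
    Measurement("P2", "study-1", "task_time_s", 9.8),
    Measurement("P1", "study-2", "task_time_s", 8.1),
]
per_study = login.aggregate("task_time_s", evaluation="study-1")
overall = login.aggregate("task_time_s")
```

The point of the sketch is only the data shape: evaluation data lives on the task model element itself, so a tool can render per-participant, per-evaluation, or cross-evaluation views without leaving the model.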

Original language: English
Title of host publication: Human-Centered Software Engineering - 7th IFIP WG 13.2 International Working Conference, HCSE 2018, Revised Selected Papers
Editors: Marta Kristín Lárusdóttir, Marco Winckler, Kati Kuusinen, Philippe Palanque, Cristian Bogdan
Place of publication: Cham
Number of pages: 18
ISBN (Electronic): 978-3-030-05909-5
ISBN (Print): 978-3-030-05908-8
Publication status: Published - 2019
Event: 7th International Working Conference on Human-Centered Software Engineering, HCSE 2018 - Sophia Antipolis, France
Duration: 3 Sept 2018 - 5 Sept 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11262 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 7th International Working Conference on Human-Centered Software Engineering, HCSE 2018
City: Sophia Antipolis


Keywords:

  • Evaluation
  • Formal description
  • Task models
  • Usability
  • User experience
  • User study


