Task classification model for visual fixation, exploration, and search

Ayush Kumar, Anjul Tyagi, Michael Burch, Daniel Weiskopf, Klaus Mueller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-reviewed

1 Citation (Scopus)

Abstract

Yarbus’ claim that an observer’s task can be decoded from their eye movements has received mixed reactions. In this paper, we support the hypothesis that such task decoding is possible. We conducted an exploratory analysis of the dataset by projecting features and data points into a scatter plot to visualize the nuanced properties of each task. Following this analysis, we eliminated highly correlated features before training SVM and AdaBoost classifiers to predict the tasks from the filtered eye movement data. We achieve an accuracy of 95.4% on this task classification problem and thereby support the hypothesis that task classification is possible from a user’s eye movement data.
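The pipeline described in the abstract (correlation-based feature filtering followed by SVM and AdaBoost classification) can be sketched roughly as follows. The synthetic data, the 0.9 correlation threshold, and the default model settings are illustrative assumptions, not the authors' actual setup:

```python
# Illustrative sketch only (not the authors' code): drop one feature from
# each highly correlated pair, then train SVM and AdaBoost classifiers.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic placeholder for features extracted from the three tasks
# (visual fixation, exploration, and search).
X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

# Eliminate highly correlated features (the 0.9 threshold is an assumption).
corr = np.abs(np.corrcoef(X, rowvar=False))
drop = {j for i in range(corr.shape[0])
          for j in range(i + 1, corr.shape[1]) if corr[i, j] > 0.9}
X = X[:, [i for i in range(X.shape[1]) if i not in drop]]

# Train both classifiers on the filtered data and score held-out accuracy.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
results = {}
for clf in (SVC(), AdaBoostClassifier(random_state=0)):
    results[type(clf).__name__] = clf.fit(X_tr, y_tr).score(X_te, y_te)
print(results)
```

The real work classifies tasks from recorded eye movement features rather than synthetic data; this sketch only shows the shape of the filter-then-classify workflow.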

Original language: English
Title of host publication: Proceedings - ETRA 2019
Subtitle of host publication: 2019 ACM Symposium on Eye Tracking Research and Applications
Editors: Stephen N. Spencer
Place of Publication: New York
Publisher: Association for Computing Machinery, Inc
Number of pages: 4
ISBN (Electronic): 978-1-4503-6709-7
DOI: 10.1145/3314111.3323073
Publication status: Published - 25 Jun 2019
Event: 11th ACM Symposium on Eye Tracking Research and Applications, ETRA 2019 - Denver, United States
Duration: 25 Jun 2019 – 28 Jun 2019

Conference

Conference: 11th ACM Symposium on Eye Tracking Research and Applications, ETRA 2019
Country: United States
City: Denver
Period: 25/06/19 – 28/06/19

Keywords

  • Classifier
  • Eye movements
  • Task decoding
  • Visual attention
  • Yarbus

Cite this

Kumar, A., Tyagi, A., Burch, M., Weiskopf, D., & Mueller, K. (2019). Task classification model for visual fixation, exploration, and search. In S. N. Spencer (Ed.), Proceedings - ETRA 2019: 2019 ACM Symposium on Eye Tracking Research and Applications (Article 65). New York: Association for Computing Machinery, Inc. https://doi.org/10.1145/3314111.3323073