Task classification model for visual fixation, exploration, and search

Ayush Kumar, Anjul Tyagi, Michael Burch, Daniel Weiskopf, Klaus Mueller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

10 Citations (Scopus)

Abstract

Yarbus’ claim that an observer’s task can be decoded from their eye movements has received mixed reactions. In this paper, we support the hypothesis that such task decoding is possible. We conducted an exploratory analysis of the dataset by projecting features and data points into scatter plots to visualize the nuanced properties of each task. Following this analysis, we eliminated highly correlated features before training an SVM and an AdaBoost classifier to predict the tasks from the filtered eye movement data. We achieve an accuracy of 95.4% on this task classification problem and thus support the hypothesis that task classification is possible from a user’s eye movement data.
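
The pipeline described in the abstract can be sketched with scikit-learn. This is a minimal illustration only: the feature matrix, labels, and the 0.9 correlation threshold below are placeholder assumptions, not the paper's actual features, dataset, or settings.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Placeholder eye-movement feature matrix X (n_samples x n_features)
# and task labels y (e.g., fixation / exploration / search); the
# paper's actual features and data are not reproduced here.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))
y = rng.integers(0, 3, size=300)

# Drop one feature from every highly correlated pair (|r| > 0.9),
# mirroring the correlation-based filtering the abstract describes.
corr = np.abs(np.corrcoef(X, rowvar=False))
upper = np.triu(corr, k=1)
keep = [i for i in range(X.shape[1]) if not (upper[:, i] > 0.9).any()]
X = X[:, keep]

# Train the two classifier types named in the abstract and report
# held-out accuracy for each.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
for clf in (SVC(), AdaBoostClassifier()):
    clf.fit(X_train, y_train)
    print(type(clf).__name__, clf.score(X_test, y_test))
```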

Original language: English
Title of host publication: Proceedings - ETRA 2019
Subtitle of host publication: 2019 ACM Symposium On Eye Tracking Research and Applications
Editors: Stephen N. Spencer
Place of Publication: New York
Publisher: Association for Computing Machinery, Inc
Number of pages: 4
ISBN (Electronic): 978-1-4503-6709-7
DOIs
Publication status: Published - 25 Jun 2019
Event: 11th ACM Symposium on Eye Tracking Research and Applications, ETRA 2019 - Denver, United States
Duration: 25 Jun 2019 – 28 Jun 2019

Conference

Conference: 11th ACM Symposium on Eye Tracking Research and Applications, ETRA 2019
Country/Territory: United States
City: Denver
Period: 25/06/19 – 28/06/19

Keywords

  • Classifier
  • Eye movements
  • Task decoding
  • Visual attention
  • Yarbus
