Task Classification Model for Visual Fixation, Exploration, and Search

Ayush Kumar, Anjul Tyagi, Michael Burch, Daniel Weiskopf, Klaus Mueller

Research output: Contribution to journal › Journal article › Academic

Abstract

Yarbus' claim that an observer's task can be decoded from their eye movements has received mixed reactions. In this paper, we support the hypothesis that the task can indeed be decoded. We conducted an exploratory analysis of the dataset by projecting features and data points into scatter plots to visualize the nuanced properties of each task. Following this analysis, we eliminated highly correlated features before training SVM and AdaBoost classifiers to predict the tasks from the filtered eye movement data. We achieve an accuracy of 95.4% on this task classification problem and thus support the hypothesis that task classification is possible from a user's eye movement data.
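As a rough illustration of the pipeline the abstract describes (correlation-based feature filtering followed by SVM and AdaBoost classification), here is a minimal sketch assuming scikit-learn. The synthetic data, feature names, 0.9 correlation threshold, and classifier hyperparameters are illustrative assumptions, not the authors' actual setup.

```python
# Hypothetical sketch: drop one feature from each highly correlated pair,
# then train SVM and AdaBoost classifiers. All data here is synthetic,
# so the printed accuracies are meaningless; only the structure matters.
import numpy as np
import pandas as pd
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def drop_correlated(df: pd.DataFrame, threshold: float = 0.9) -> pd.DataFrame:
    """Remove one feature from each pair whose |Pearson r| exceeds threshold."""
    corr = df.corr().abs()
    # Keep only the upper triangle so each pair is inspected once.
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    to_drop = [col for col in upper.columns if (upper[col] > threshold).any()]
    return df.drop(columns=to_drop)


# Placeholder stand-in for per-trial eye movement features
# (e.g., fixation counts, saccade lengths) and task labels
# (fixation / exploration / search).
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(300, 6)),
                 columns=[f"feat{i}" for i in range(6)])
X["feat_dup"] = X["feat0"] * 0.98 + rng.normal(scale=0.05, size=300)
y = rng.integers(0, 3, size=300)

X_filtered = drop_correlated(X)  # feat_dup is dropped as near-duplicate

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
ada = AdaBoostClassifier(n_estimators=100)
for name, clf in [("SVM", svm), ("AdaBoost", ada)]:
    scores = cross_val_score(clf, X_filtered.values, y, cv=5)
    print(f"{name}: mean CV accuracy {scores.mean():.3f}")
```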
Original language: English
Article number: 1907.12635
Journal: arXiv
Volume: 2019
Status: Published - 29 Jul 2019

