Abstract
Yarbus’ claim that the observer’s task can be decoded from eye movements has received mixed reactions. In this paper, we support the hypothesis that such decoding is possible. We conducted an exploratory analysis of the dataset, projecting features and data points into scatter plots to visualize the nuanced properties of each task. Following this analysis, we eliminated highly correlated features before training an SVM and an AdaBoost classifier to predict the tasks from the filtered eye-movement data. We achieve an accuracy of 95.4% on this task-classification problem and hence support the hypothesis that task classification is possible from a user’s eye-movement data.
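The pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors’ implementation: the feature names, correlation threshold, classifier hyperparameters, and the synthetic stand-in data are all assumptions, since the abstract does not specify them.

```python
# Hedged sketch of the described pipeline: drop highly correlated
# features, then train SVM and AdaBoost classifiers to predict the task.
# Feature names and data are hypothetical stand-ins for eye-movement data.
import numpy as np
import pandas as pd
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 300

# Synthetic stand-in for per-trial eye-movement features.
X = pd.DataFrame({
    "fixation_duration": rng.normal(0, 1, n),
    "saccade_amplitude": rng.normal(0, 1, n),
    "pupil_diameter": rng.normal(0, 1, n),
})
# A deliberately redundant feature, which the filter should remove.
X["fixation_count"] = X["fixation_duration"] * 0.98 + rng.normal(0, 0.05, n)
y = (X["saccade_amplitude"] + X["pupil_diameter"] > 0).astype(int)  # toy task label

# Step 1: eliminate one member of each highly correlated feature pair
# (threshold 0.9 is an assumption; the paper does not state its cutoff).
corr = X.corr().abs()
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
to_drop = [col for col in upper.columns if (upper[col] > 0.9).any()]
X_filtered = X.drop(columns=to_drop)

# Step 2: train and evaluate the two classifiers on the filtered features.
X_tr, X_te, y_tr, y_te = train_test_split(X_filtered, y, random_state=0)
for clf in (SVC(kernel="rbf"), AdaBoostClassifier(random_state=0)):
    clf.fit(X_tr, y_tr)
    print(type(clf).__name__, round(clf.score(X_te, y_te), 2))
```

On real data, accuracy would of course depend on the actual eye-movement features and tasks; this sketch only shows the shape of the correlation-filter-then-classify procedure.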
Original language | English |
---|---|
Title of host publication | Proceedings - ETRA 2019 |
Subtitle of host publication | 2019 ACM Symposium On Eye Tracking Research and Applications |
Editors | Stephen N. Spencer |
Place of Publication | New York |
Publisher | Association for Computing Machinery, Inc |
Number of pages | 4 |
ISBN (Electronic) | 978-1-4503-6709-7 |
DOIs | |
Publication status | Published - 25 Jun 2019 |
Event | 11th ACM Symposium on Eye Tracking Research and Applications, ETRA 2019 - Denver, United States |
Duration | 25 Jun 2019 → 28 Jun 2019 |
Conference
Conference | 11th ACM Symposium on Eye Tracking Research and Applications, ETRA 2019 |
---|---|
Country/Territory | United States |
City | Denver |
Period | 25/06/19 → 28/06/19 |
Keywords
- Classifier
- Eye movements
- Task decoding
- Visual attention
- Yarbus