Abstract
Gaze is an important nonverbal cue in human-human communication, for example in conveying the direction of attention. Presumably, the ability to understand and produce gaze cues is therefore an important aspect of a robot's interactive behavior. While considerable progress has been made in designing social gaze cues for robots, little has been done to examine humans' ability to read and accept help signals from a robot's gaze. In this study, we examine how people perceive gaze cues and head angles directed towards different target positions on a table when a human and a NAO robot sit opposite each other, as in board-game scenarios. The results show that when the head pitch angle is higher (24±2) and the depth is smaller, approximately 20 cm from the robot, participants detected the positions with good accuracy. Unexpectedly, locations to the robot's left were detected with lower accuracy. In conclusion, we discuss the implications of this research for the design of interaction settings between a human and a robot intended for social and educational support.
Original language | English |
---|---|
Title of host publication | 4th International Conference on Human-Agent Interaction (HAI 2016), 4-7 October, 2016, Singapore |
Pages | 329-332 |
Number of pages | 4 |
ISBN (Electronic) | 978-1-4503-4508-8 |
Publication status | Published - 2016 |
Event | 4th International Conference on Human-Agent Interaction (HAI 2016), The Matrix building, Biopolis, Singapore; Duration: 4 Oct 2016 → 7 Oct 2016; http://hai-conference.net/hai2016/ |
Conference
Conference | 4th International Conference on Human-Agent Interaction (HAI 2016), 4-7 October 2016, Biopolis, Singapore |
---|---|
Abbreviated title | HAI 2016 |
Country | Singapore |
City | Biopolis |
Period | 4/10/16 → 7/10/16 |
Internet address | http://hai-conference.net/hai2016/ |