Abstract
If robots are to communicate successfully with humans, they need to be able to take and give turns during conversations; effective and appropriate turn-taking and turn-yielding actions are crucial to doing so. The present study investigates the objective and subjective performance of four different turn-yielding cues performed by a NAO robot. The results show that an artificial cue, flashing eye-LEDs, led to significantly shorter response times from the conversational partner than giving no cue and was experienced as an improvement to the conversation. In contrast, the stopping-arm-movement and head-turning cues showed, respectively, no significant difference and even longer response times compared with the baseline condition. We conclude that turn-yielding cues can lead to improved conversations, although this depends on the type of cue, and that copying human turn-yielding cues is not necessarily the best option for robots.
| Original language | English |
| --- | --- |
| Number of pages | 2 |
| Publication status | Published - 2015 |
| Event | AISB Convention 2015, Society for the Study of Artificial Intelligence and Simulation of Behaviour, 20-22 April 2015, Canterbury - University of Kent, Canterbury, United Kingdom. Duration: 20 Apr 2015 → 22 Apr 2015. http://www.aisb.org.uk/ |
Conference
| Conference | AISB Convention 2015, Society for the Study of Artificial Intelligence and Simulation of Behaviour, 20-22 April 2015, Canterbury |
| --- | --- |
| Abbreviated title | AISB2015 |
| Country/Territory | United Kingdom |
| City | Canterbury |
| Period | 20/04/15 → 22/04/15 |
| Internet address | http://www.aisb.org.uk/ |