Turn-yielding cues in robot-human conversation

J.A. van Schendel, R.H. Cuijpers

Research output: Contribution to conference › Paper › Academic


Abstract

If robots are to communicate with humans successfully, they will need to be able to take and give turns during conversations. Effective and appropriate turn-taking and turn-yielding actions are crucial to doing so. The present study investigates the objective and subjective performance of four different turn-yielding cues performed by a NAO robot. The results show that an artificial cue, flashing eye-LEDs, led to significantly shorter response times by the conversational partner than giving no cue, and was experienced as an improvement to the conversation. In contrast, the stopping-arm-movement cue showed no significant difference from the baseline condition, and the head-turning cue even produced longer response times. We conclude that turn-yielding cues can lead to improved conversations, though this depends on the type of cue, and that copying human turn-yielding cues is not necessarily the best option for robots.

Original language: English
Number of pages: 2
Publication status: Published - 2015
Event: AISB Convention 2015, Society for the Study of Artificial Intelligence and Simulation of Behaviour - University of Kent, Canterbury, United Kingdom
Duration: 20 Apr 2015 - 22 Apr 2015
http://www.aisb.org.uk/

Conference

Conference: AISB Convention 2015, Society for the Study of Artificial Intelligence and Simulation of Behaviour
Abbreviated title: AISB2015
Country/Territory: United Kingdom
City: Canterbury
Period: 20/04/15 - 22/04/15
Internet address: http://www.aisb.org.uk/
