Head pose estimation for a domestic robot

D. van der Pol, R.H. Cuijpers, J.F. Juola

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

8 Citations (Scopus)

Abstract

Gaze direction is an important communicative cue. To use this cue for human-robot interaction, software is needed that can estimate head pose. We designed an application that makes a good estimate of head pose and, unlike earlier head pose estimation approaches, works under non-optimal lighting conditions. Initial results show that our approach, which uses multiple networks trained on differing datasets, gives a good estimate of head pose and works well in poor lighting conditions and with low-resolution images. We validated our head pose estimation method using a custom-built database of images of human heads; the actual head poses were measured with a trakStar (Ascension Technologies) six-degrees-of-freedom sensor. The head pose estimation algorithm allows us to assess a person’s focus of attention, which lets robots react in a timely fashion to dynamic human communicative cues.
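The abstract only sketches the approach: several networks are trained on differing datasets and their head pose estimates are combined. No code is given in this record, so the snippet below is a purely illustrative sketch of one way per-network (yaw, pitch, roll) estimates could be fused; the function name fuse_head_pose, the weighted-mean fusion, and the example values are assumptions, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): fusing head-pose estimates
# from several networks trained on differing datasets.
import numpy as np


def fuse_head_pose(estimates, weights=None):
    """Combine (yaw, pitch, roll) estimates, in degrees, from multiple models.

    estimates : list of (yaw, pitch, roll) tuples, one per network.
    weights   : optional per-network confidence weights (assumed uniform if omitted).
    """
    est = np.asarray(estimates, dtype=float)          # shape (n_models, 3)
    if weights is None:
        weights = np.ones(len(est))
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()                          # normalize to sum to 1
    # A plain weighted mean is used here for simplicity; for large angles a
    # circular mean would be more appropriate.
    return tuple(weights @ est)


if __name__ == "__main__":
    # Three hypothetical networks returning slightly different estimates (degrees).
    preds = [(12.0, -5.0, 1.5), (10.5, -4.2, 2.0), (11.8, -5.6, 1.2)]
    print(fuse_head_pose(preds))  # fused (yaw, pitch, roll)
```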
Original language: English
Title of host publication: Proceedings of the 6th International Conference on Human-Robot Interaction (HRI 2011), March 6-9, 2011
Editors: A. Billard, P. Kahn, J.A. Adams, G. Trafton
Place of Publication: New York
Publisher: Association for Computing Machinery, Inc.
Pages: 277-278
ISBN (Print): 978-1-4503-0561-7
DOIs
Publication status: Published - 2011
Event: 6th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2011 - Lausanne, Switzerland
Duration: 6 Mar 2011 – 9 Mar 2011
Conference number: 6

Conference

Conference: 6th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2011
Abbreviated title: HRI 2011
Country/Territory: Switzerland
City: Lausanne
Period: 6/03/11 – 9/03/11
