Learning robot vision for assisted living

W. Yan, E. Torta, D. van der Pol, N. Meins, C. Weber, R.H. Cuijpers, S. Wermter

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic

6 Citations (Scopus)

Abstract

This chapter presents an overview of a typical Ambient Assisted Living (AAL) scenario in which a robot navigates to a person to convey information. Indoor robot navigation is a challenging task due to the complexity of real home environments and the need for online learning abilities to adjust to dynamic conditions. A comparison of systems with different sensor types shows that vision-based systems promise good performance and a wide scope of usage at reasonable cost. Moreover, vision-based systems can perform different tasks simultaneously by applying different algorithms to the same input data stream, which enhances the flexibility of the system. The authors review the state of the art in computer vision methods for indoor robot navigation to a person and for human-robot interaction. A case study was conducted in which a robot, as part of an AAL system, navigates to a person and interacts with her. The authors evaluate this test case and give an outlook on the potential of learning robot vision in ambient homes.
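The abstract notes that a single vision sensor can serve several algorithms at once by fanning one input stream out to multiple tasks. A minimal sketch of that idea, with purely illustrative task names and a toy frame representation (not the chapter's actual implementation):

```python
# Hypothetical sketch: one camera stream feeding several vision tasks.
# Task names and the frame representation (a set of scene labels) are
# illustrative assumptions, not the system described in the chapter.

class VisionPipeline:
    """Fan one input frame out to independently registered vision tasks."""

    def __init__(self):
        self.tasks = {}  # task name -> callable(frame) -> result

    def register(self, name, fn):
        # Each algorithm registers under a name; all share the same input.
        self.tasks[name] = fn

    def process(self, frame):
        # Apply every registered algorithm to the same frame and
        # collect the results per task.
        return {name: fn(frame) for name, fn in self.tasks.items()}


pipeline = VisionPipeline()
pipeline.register("person_detection", lambda f: "person" in f)
pipeline.register("obstacle_check", lambda f: "obstacle" in f)

# One simulated frame is processed by both tasks simultaneously.
results = pipeline.process({"person", "doorway"})
```

This registry pattern keeps each vision task decoupled, so tasks can be added or removed without touching the sensor input path.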
Original language: English
Title of host publication: Robotic vision
Subtitle of host publication: Technologies for machine learning and vision applications
Editors: J. García-Rodríguez, M. Cazorla
Place of publication: Hershey
Publisher: IGI Global
Pages: 257-280
ISBN (Electronic): 978-1-4666-2703-1
ISBN (Print): 978-1-4666-2672-0
Publication status: Published - 2012
