Visuo-Tactile Based Predictive Cross Modal Perception for Object Exploration in Robotics

Anirvan Dutta, Etienne Burdet, Mohsen Kaboli

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

1 Citation (Scopus)

Abstract

Autonomously exploring the unknown physical properties of novel objects, such as stiffness, mass, center of mass, friction coefficient, and shape, is crucial for autonomous robotic systems operating continuously in unstructured environments. We introduce a novel visuo-tactile predictive cross-modal perception framework in which initial visual observations (shape) provide a prior over object properties (mass). This prior improves the efficiency of object property estimation, which is performed autonomously through interactive non-prehensile pushing using a dual filtering approach. The inferred properties are then used to efficiently refine the predictive capability of the cross-modal function via a human-inspired 'surprise' formulation. We evaluated the proposed framework in a real robotic scenario, demonstrating superior performance.
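
The abstract describes the pipeline only at a high level; the sketch below is a hypothetical, minimal reading of it in Python, not the authors' implementation. It assumes a shape-derived volume seeds a Gaussian mass prior through an illustrative density parameter (`density_guess`), a single scalar Kalman update stands in for the paper's dual filtering over push interactions, and 'surprise' is taken to be Bayesian surprise, i.e. the KL divergence between posterior and prior. All names, values, and the threshold are assumptions for illustration.

```python
import numpy as np

def kl_gaussian(mu_q, var_q, mu_p, var_p):
    """KL(q || p) between two 1-D Gaussians: one common 'Bayesian surprise' measure."""
    return 0.5 * (np.log(var_p / var_q) + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)

def visual_mass_prior(volume, density_guess=500.0, rel_std=0.5):
    """Map a shape-derived volume (m^3) to a Gaussian mass prior via an
    assumed cross-modal density parameter (kg/m^3); both are illustrative."""
    mu = density_guess * volume
    return mu, (rel_std * mu) ** 2

def kalman_update(mu, var, z, meas_var):
    """Scalar Kalman measurement update; a stand-in for the paper's
    dual filtering over interactive pushes."""
    k = var / (var + meas_var)
    return mu + k * (z - mu), (1.0 - k) * var

rng = np.random.default_rng(0)
true_mass, volume = 0.8, 1.2e-3             # ground truth (kg) and observed volume (m^3)

mu, var = visual_mass_prior(volume)          # cross-modal prior from vision
mu0, var0 = mu, var
for _ in range(10):                          # each push yields a noisy mass estimate
    z = true_mass + rng.normal(0.0, 0.05)
    mu, var = kalman_update(mu, var, z, 0.05 ** 2)

surprise = kl_gaussian(mu, var, mu0, var0)   # how much the pushes contradicted vision
print(f"posterior mass: {mu:.3f} kg, surprise: {surprise:.2f} nats")
if surprise > 1.0:                           # hypothetical threshold
    density_guess = mu / volume              # refit the cross-modal parameter
    print(f"updated density guess: {density_guess:.0f} kg/m^3")
```

Under this reading, gating the cross-modal update on surprise means the vision-derived mapping is revised only when interaction evidence meaningfully contradicts it, keeping updates cheap when the prior is already good.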

Original language: English
Title of host publication: 2024 IEEE International Symposium on Robotic and Sensors Environments, ROSE 2024
Publisher: Institute of Electrical and Electronics Engineers
Number of pages: 7
ISBN (Electronic): 979-8-3503-6236-7
DOIs
Publication status: Published - 17 Jul 2024
Event: 2024 IEEE International Symposium on Robotic and Sensors Environments, ROSE 2024 - Chemnitz, Germany
Duration: 20 Jun 2024 → 21 Jun 2024

Conference

Conference: 2024 IEEE International Symposium on Robotic and Sensors Environments, ROSE 2024
Country/Territory: Germany
City: Chemnitz
Period: 20/06/24 → 21/06/24
