Multi-modal sensor fusion for highly accurate vehicle motion state estimation

Vicent Rodrigo Marco (Corresponding author), Jens Kalkkuhl, Jörg Raisch, Wouter Scholte, Henk Nijmeijer, Thomas Seel

Research output: Contribution to journal · Article · Academic · Peer-reviewed

3 Citations (Scopus)

Abstract

In the context of autonomous driving in urban environments, accurate and reliable information about the vehicle motion is crucial. This article presents a multi-modal sensor fusion scheme that, based on standard production car sensors and an inertial measurement unit, estimates the three-dimensional vehicle velocity and the attitude angles (pitch and roll). To enhance estimation accuracy, the scheme simultaneously estimates the gyroscope and accelerometer biases. The approach relies on a state-affine representation of a kinematic model with an additional measurement equation based on a single-track model. The sensor fusion scheme is built upon a recently proposed adaptive estimator, which allows a direct consideration of model uncertainties and sensor noise. To provide accurate estimates during collision avoidance manoeuvres, a measurement covariance adaptation is introduced, which reduces the influence of the single-track model when its information is superfluous. A validation using experimental data demonstrates the effectiveness of the method during both regular urban drives and collision avoidance manoeuvres.
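The measurement covariance adaptation described in the abstract can be illustrated with a minimal sketch: a standard Kalman measurement update whose measurement noise covariance for the single-track model channel is inflated when the manoeuvre becomes too dynamic for that model's assumptions. This is an illustrative reconstruction, not the authors' implementation; the function names, the lateral-acceleration trigger, and all thresholds and gains are hypothetical placeholders.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update for state x with covariance P,
    measurement z, measurement matrix H and measurement covariance R."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ (z - H @ x)              # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P     # corrected covariance
    return x_new, P_new

def adapt_single_track_covariance(R_nominal, a_lat,
                                  a_threshold=4.0, inflation=100.0):
    """Inflate the single-track measurement covariance during highly
    dynamic manoeuvres (e.g. collision avoidance), so the filter relies
    less on that model when its assumptions break down.
    The threshold and inflation factor are illustrative, not from the paper."""
    if abs(a_lat) > a_threshold:
        return R_nominal * inflation
    return R_nominal
```

During a regular urban drive the nominal covariance is used and the single-track measurement corrects the velocity estimate; during an evasive manoeuvre the inflated covariance makes the Kalman gain for that channel small, effectively down-weighting the model, which matches the idea of reducing its influence "when its information is superfluous".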
Original language: English
Article number: 104409
Number of pages: 16
Journal: Control Engineering Practice
Volume: 100
DOIs
Publication status: Published - Jul 2020

Keywords

  • Motion estimation
  • Observability
  • Automotive industry
  • Non-linear systems
  • Inertial sensors
  • Kalman filter
  • Odometry
  • Collision avoidance
  • Autonomous driving
  • Simultaneous state and parameter estimation
  • Systems and control engineering

