Event-based FMCW-Radar for Full Body Gesture Classification and Sensor Fusion with a Dynamic Vision Sensor

  • Leon Müller

Student thesis: Master

Abstract

Modern radar systems achieve high resolution thanks to gigahertz frequency bands, high-performance analog-to-digital converters (ADCs), and large numbers of antennas. This precision, however, comes at the cost of a deluge of data, which in turn increases computational costs and makes radar technology challenging to apply in edge-computing scenarios. To enable the use of radar in edge artificial intelligence applications, the data rate must therefore be reduced with little or no impact on accuracy. To this end, we propose a new, efficient encoding strategy for mm-wave radars based on event-based neuromorphic approaches. Our event-based encoding method works with FMCW radar sensors and significantly reduces the data rate, bit-width, and density of the radar's ADC output with minimal impact on the classification accuracy of convolutional neural networks. In contrast to conventional frame-based radar sensors, which have a high data rate, our method only reports changes in the radar's returned chirps (the ADC signal) caused by movement in front of the sensor. The static background is therefore not captured, and no events are generated where no movement is sensed. This encoding is achieved with level-crossing delta processing of the raw ADC data, which produces a binary positive/negative event whenever the signal crosses a specified threshold. Applied to an 8 GHz low-power radar, our event-based processing achieves a density reduction of 72.76%, for a final ADC density of 27.24%, without losing essential features and thus with minimal impact on classification accuracy. We also show that further compression by reducing the ADC data to the ternary values [-1, 0, 1] does not impact classification accuracy either. We demonstrate our approach on a new, first-of-its-kind dataset recorded with an 8 GHz low-power FMCW radar synchronized at the chirp level with an event-based camera (Dynamic Vision Sensor, DVS).
We experimentally observe that, with an 87.5% reduction in bit-width and 72.76% less dense data, our baseline accuracy is reduced by only around 1.73% over all classes in the dataset, and increases by around 2% when taking only a subset of the five best-performing classes.
We also show that DVS and our event-based FMCW-radar encoding can achieve higher classification accuracy with sensor fusion, showing that both sensors complement each other well.
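The level-crossing delta processing described above can be illustrated with a minimal sketch. The function name, threshold value, and test signal below are hypothetical choices for illustration, not the thesis implementation: an event (+1 or -1) is emitted whenever a raw ADC sample moves more than a fixed threshold away from the last reference level, and zeros elsewhere, so static (unchanging) input produces no events at all.

```python
import numpy as np

def level_crossing_encode(adc: np.ndarray, threshold: float) -> np.ndarray:
    """Encode a 1-D raw ADC chirp as ternary events in [-1, 0, +1].

    Emits +1/-1 whenever the sample deviates from the last reference
    level by more than `threshold`; the reference then jumps to the
    current sample. All other positions stay 0 (no event).
    """
    events = np.zeros(len(adc), dtype=np.int8)
    ref = adc[0]  # reference level tracks the last crossing
    for i in range(1, len(adc)):
        delta = adc[i] - ref
        if delta >= threshold:
            events[i] = 1
            ref = adc[i]
        elif delta <= -threshold:
            events[i] = -1
            ref = adc[i]
    return events

# Density = fraction of non-zero events; a static background yields
# zero density, while movement-induced chirp changes produce events.
chirp = np.sin(np.linspace(0.0, 8.0 * np.pi, 256))
events = level_crossing_encode(chirp, threshold=0.2)
density = np.count_nonzero(events) / events.size
```

On a slowly varying or static signal the density approaches zero, which is the source of the data-rate reduction reported in the abstract; the threshold trades off density against how faithfully the chirp shape is preserved.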
Date of Award: 15 Aug 2022

Original language: English

Supervisors: Federico Corradi (Supervisor 1), Sander Stuijk (Supervisor 2), Manolis Sifalakis (External coach) & Sherif Eissa (Supervisor 2)

Keywords

  • radar
  • neuromorphic sensing
  • sensory fusion
  • neural networks
  • edge AI
