PRESS/HOLD/RELEASE ultrasonic gestures and low complexity recognition based on TCN

Emad Ibrahim, Min Li, Jose Pineda de Gyvez

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

4 Citations (Scopus)

Abstract

Targeting ultrasound-based gesture recognition, this paper proposes a new universal PRESS/HOLD/RELEASE approach that leverages the diversity of gestures performed on smart devices such as mobile phones and IoT nodes. The new set of gestures is generated by interleaving PRESS/HOLD/RELEASE patterns, abbreviated as P/H/R, with gestures such as sweeps between a number of microphones. A P/H/R pattern is constructed by a hand as it approaches the top of a microphone to generate a virtual Press, settles there for an arbitrary period of time to generate a virtual Hold, and finally departs to generate a virtual Release. The same hand can sweep to a second microphone and perform another P/H/R. Interleaving the P/H/R patterns expands the number of gestures that can be performed. Assuming an on-board speaker transmitting ultrasonic signals, detection is performed on the Doppler shift readings generated by a hand as it approaches and departs from the top of a microphone. The Doppler shift readings are represented as a sequence of down-mixed ultrasonic spectrogram frames. We train a Temporal Convolutional Network (TCN) to classify the P/H/R patterns under different environmental noises. Our experimental results show that P/H/R patterns above a microphone can be recognized with 96.6% accuracy under different noise conditions. A group of P/H/R based gestures has been tested on a commercial off-the-shelf (COTS) Samsung Galaxy S7 Edge. Different P/H/R interleaved gestures (such as sweeps, long taps, etc.) are designed using two microphones and a single speaker while requiring as few as ~5K parameters and as little as ~0.15 million operations (MOPs) of compute per inference. The P/H/R interleaved set of gestures is intuitive and hence easy for end users to learn, which paves the way for deployment in mass-produced smartphones and smart speakers.
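
For illustration, the sketch below shows what a TCN of this scale might look like: a few dilated causal 1-D convolution layers applied to a sequence of down-mixed spectrogram frames, producing per-frame class logits. It is a minimal sketch assuming PyTorch; the frame dimension (N_BINS), number of classes, channel widths, and dilations are illustrative assumptions, not the architecture reported in the paper.

# Minimal sketch of a small TCN over down-mixed ultrasonic spectrogram frames.
# PyTorch is assumed; N_BINS, N_CLASSES, channel widths and dilations are
# illustrative choices, not the exact architecture from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

N_BINS = 16      # Doppler-shift bins per spectrogram frame (assumed)
N_CLASSES = 4    # e.g. Press, Hold, Release, background (assumed)

class CausalConv1d(nn.Module):
    """1-D convolution padded on the left only, so frame t never sees frame t+1."""
    def __init__(self, c_in, c_out, kernel, dilation):
        super().__init__()
        self.pad = (kernel - 1) * dilation
        self.conv = nn.Conv1d(c_in, c_out, kernel, dilation=dilation)

    def forward(self, x):
        return self.conv(F.pad(x, (self.pad, 0)))

class TinyTCN(nn.Module):
    """Stack of dilated causal convolutions followed by per-frame class logits."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            CausalConv1d(N_BINS, 16, kernel=3, dilation=1), nn.ReLU(),
            CausalConv1d(16, 16, kernel=3, dilation=2), nn.ReLU(),
            CausalConv1d(16, 16, kernel=3, dilation=4), nn.ReLU(),
        )
        self.head = nn.Conv1d(16, N_CLASSES, kernel_size=1)

    def forward(self, x):                  # x: (batch, N_BINS, time)
        return self.head(self.body(x))     # (batch, N_CLASSES, time)

model = TinyTCN()
frames = torch.randn(1, N_BINS, 64)        # a short window of spectrogram frames
logits = model(frames)                     # (1, N_CLASSES, 64)
print(logits.shape, sum(p.numel() for p in model.parameters()))

With these sizes the model comes to roughly 2-3K parameters, comfortably inside the ~5K budget quoted in the abstract, and the dilations of 1, 2 and 4 give each output frame a receptive field of about 15 spectrogram frames.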

Original language: English
Title of host publication: Proceedings of SiPS 2019: the IEEE International Workshop on Signal Processing Systems
Place of Publication: Piscataway
Publisher: Institute of Electrical and Electronics Engineers
Pages: 172-177
Number of pages: 6
ISBN (Electronic): 9781728119274
DOIs
Publication status: Published - Oct 2019
Event: 33rd IEEE Workshop on Signal Processing Systems, SiPS 2019 - Nanjing, China
Duration: 20 Oct 2019 - 23 Oct 2019
Conference number: 33

Conference

Conference: 33rd IEEE Workshop on Signal Processing Systems, SiPS 2019
Abbreviated title: SiPS 2019
Country/Territory: China
City: Nanjing
Period: 20/10/19 - 23/10/19

Keywords

  • Doppler shift
  • Gesture Recognition
  • HSI
  • TCN
  • Ultrasound
