A machine learning based approach for gesture recognition from inertial measurements

Giuseppe Belgioioso, Angelo Cenedese, Giuseppe Ilario Cirillo, Francesco Fraccaroli, Gian Antonio Susto

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

13 Citations (Scopus)

Abstract

Gesture-based interaction has become a prominent way to interact with electronic devices. In this paper a Machine Learning (ML) based approach to gesture recognition (GR) is illustrated; the proposed tool is independent of user, device, and device orientation. The tool has been tested on a heterogeneous dataset representative of a typical gesture recognition application. In the present work two novel ML algorithms based on Sparse Bayesian Learning are tested against other classification approaches already employed in the literature (Support Vector Machine, Relevance Vector Machine, k-Nearest Neighbor, Discriminant Analysis). A second element of novelty is a Principal Component Analysis (PCA) based pre-processing step, called Pre-PCA, which is shown to enhance gesture recognition under heterogeneous working conditions. Feature extraction techniques are also investigated: a PCA-based approach is compared to Frame-Based Description methods.
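For context, the sketch below illustrates the kind of pipeline the abstract describes: PCA-based feature extraction on inertial (accelerometer) recordings followed by standard classifiers such as SVM and k-NN. It is a minimal illustration, not the authors' implementation: the dataset, window length, number of principal components, and classifier settings are assumptions, and the paper's Pre-PCA procedure and Sparse Bayesian Learning classifiers are not reproduced here.

```python
# Minimal sketch (assumed setup, not the paper's code): PCA feature
# extraction on inertial time series, then SVM / k-NN classification.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder data: 200 gestures, each a 3-axis accelerometer recording of
# 100 samples flattened to a 300-dimensional vector; 4 gesture classes.
X = rng.standard_normal((200, 3 * 100))
y = rng.integers(0, 4, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("k-NN", KNeighborsClassifier(n_neighbors=5))]:
    # Standardize, project onto the leading principal components
    # (the feature-extraction step), then classify.
    model = make_pipeline(StandardScaler(), PCA(n_components=10), clf)
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", model.score(X_te, y_te))
```

On real gesture data the PCA components would be fitted on training windows only, and the number of components tuned by cross-validation; with the random placeholder data above the reported accuracies are, of course, at chance level.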

Original language: English
Title of host publication: 53rd IEEE Conference on Decision and Control (CDC2014)
Pages: 4899-4904
Number of pages: 6
DOIs
Publication status: Published - 1 Jan 2014
Event: 53rd IEEE Conference on Decision and Control (CDC2014) - "J.W. Marriott Hotel", Los Angeles, United States
Duration: 15 Dec 2014 - 17 Dec 2014
Conference number: 53
http://cdc2014.ieeecss.org/

Conference

Conference: 53rd IEEE Conference on Decision and Control (CDC2014)
Abbreviated title: CDC2014
Country/Territory: United States
City: Los Angeles
Period: 15/12/14 - 17/12/14
Internet address: http://cdc2014.ieeecss.org/
