Orientation of the arm used for gesture recognition



Gesture recognition enables a natural extension of the way we currently interact with devices. Commercially available gesture recognition systems are usually pre-trained and offer no option for customization by the user. To improve the user experience, it is desirable to allow end users to define their own gestures. This scenario requires learning from just a few training examples if we want to impose only a light training load on the user. To this end, we propose a gesture classifier based on a hierarchical probabilistic modeling approach. In this framework, high-level features that are shared among different gestures can be extracted from a large labeled data set, yielding a prior distribution over gestures. When learning new types of gestures, this learned shared prior reduces the number of training examples required per gesture. To test our approach, we collected a gesture database using the Myo sensor bracelet, which is worn around the forearm. The Myo measures forearm orientation as quaternions using a nine-axis IMU. The dataset contains 17 different types of arm movements from 7 test subjects (students from the Electrical Engineering department).
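Since the dataset stores forearm orientation as unit quaternions, a consumer of the data will typically need to normalize each sample and apply it as a rotation. A minimal sketch of that, assuming quaternions in (w, x, y, z) order (the function names and ordering are illustrative, not taken from the dataset files):

```python
import numpy as np

def normalize(q):
    """Return q scaled to unit norm, as required for a rotation quaternion."""
    q = np.asarray(q, dtype=float)
    return q / np.linalg.norm(q)

def rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z).

    Uses the identity v' = v + 2 u x (u x v + w v), where u is the
    vector part of q. This avoids building a full rotation matrix.
    """
    w, x, y, z = q
    u = np.array([x, y, z])
    v = np.asarray(v, dtype=float)
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

# Sanity check: a 90-degree rotation about the z-axis maps x-hat to y-hat.
q90z = normalize([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
print(rotate(q90z, [1.0, 0.0, 0.0]))  # approximately [0, 1, 0]
```

Normalizing before rotating matters in practice: raw IMU quaternion streams can drift slightly from unit norm due to quantization.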
Date made available: 6 Jan 2020
Publisher: 4TU.Centre for Research Data
Date of data production: 1 Oct 2016 - 1 May 2017
  • A probabilistic modeling approach to one-shot gesture recognition

    van Diepen, A., Cox, M. & de Vries, B., 6 Jul 2018, In: arXiv. 2018, 24 p., 1806.11408v2.

    Research output: Contribution to journal › Article › Academic

    Open Access
  • An in-situ trainable gesture classifier

    van Diepen, A., Cox, M. G. H. & de Vries, A., 10 Jun 2017, Benelearn 2017: Proceedings of the Twenty-Sixth Benelux Conference on Machine Learning, Technische Universiteit Eindhoven, 9-10 June 2017. Duivesteijn, W., Pechenizkiy, M. & Fletcher, G. H. L. (eds.). p. 66-68

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

    Open Access
