A hierarchical approach for associating body-worn sensors to video regions in crowded Mingling scenarios

Laura Cabrera-Quiros (Corresponding author), Hayley Hung

Research output: Contribution to journal › Article › Academic › peer-review

5 Citations (Scopus)

Abstract

We address the complex problem of associating several wearable devices with the spatio-temporal regions of their wearers in video during crowded mingling events, using only acceleration and proximity. This is a particularly important first step for multisensor behavior analysis using video and wearable technologies, where the privacy of the participants must be maintained. Most state-of-the-art works using these two modalities perform the association manually, which becomes practically infeasible as the number of people in the scene increases. We propose an automatic association method based on a hierarchical linear assignment optimization, which exploits the spatial context of the scene. Moreover, we present extensive experiments on matching from 2 to more than 69 acceleration and video streams, showing significant improvements over a random baseline in a real-world crowded mingling scenario. We also show the effectiveness of our method for incomplete or missing streams (up to a certain limit) and analyze the tradeoff between the length of the streams and the number of participants. Finally, we provide an analysis of failure cases, showing that a deeper understanding of the social actions within the context of the event is necessary to further improve performance on this intriguing task.
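The core of such an association can be framed as a linear assignment problem: build a cost matrix between every wearable acceleration stream and every video-derived motion stream, then solve for the globally optimal one-to-one matching. The sketch below illustrates only this assignment step, not the paper's full hierarchical method with spatial context; the negative-correlation cost, the toy data, and the function name are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def associate_streams(accel, video):
    """Match each acceleration stream to a video motion stream.

    Minimal sketch: the cost of pairing stream i with region j is the
    negative Pearson correlation of their signals, so highly correlated
    pairs are cheap. The Hungarian algorithm then finds the globally
    optimal one-to-one assignment.
    """
    cost = np.empty((len(accel), len(video)))
    for i, a in enumerate(accel):
        for j, v in enumerate(video):
            # Higher correlation -> lower assignment cost.
            cost[i, j] = -np.corrcoef(a, v)[0, 1]
    rows, cols = linear_sum_assignment(cost)
    return dict(zip(rows, cols))  # accel index -> video index


# Toy demo: video streams are noisy copies of permuted acceleration streams.
rng = np.random.default_rng(0)
accel = [rng.standard_normal(200) for _ in range(5)]
perm = [3, 0, 4, 1, 2]  # ground-truth pairing: video j shows wearer perm[j]
video = [accel[p] + 0.1 * rng.standard_normal(200) for p in perm]
match = associate_streams(accel, video)
```

With this low noise level the assignment recovers the ground-truth permutation exactly; in the real mingling setting, ambiguous costs between similarly moving neighbors are precisely what motivates the hierarchical, spatial-context-aware refinement described in the abstract.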

Original language: English
Article number: 8584113
Pages (from-to): 1867-1879
Number of pages: 13
Journal: IEEE Transactions on Multimedia
Volume: 21
Issue number: 7
DOIs
Publication status: Published - 1 Jul 2019

Keywords

  • acceleration
  • association
  • computer vision
  • mingling
  • wearable sensor

