Adaptive activity spotting based on event rates

O.D. Amft

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

7 Citations (Scopus)
156 Downloads (Pure)


To date, many activity spotting approaches are static: once the system is trained and deployed, it does not change anymore. This approach has substantial shortcomings; in particular, spotting performance suffers when activity patterns or the sensor noise level change. In this work, an unsupervised sensitivity adaptation mechanism is proposed for activity event spotting, based on expected activity event rates. The expected event rate for activity spotting was derived from the generalisation metric used in information retrieval. To illustrate generalisation effects and show the relation between spotting performance and event rate, different event rates were simulated and their precision-recall spotting performance analysed. Subsequently, the sensitivity adaptation concept is presented and evaluated. For this purpose, two large datasets from personal healthcare applications were considered to explore benefits and limitations of the adaptation approach: recognition of drinking motions from inertial sensors and of chewing strokes from sound. Results showed up to a 28% increase in spotting performance for event-rate-adapted operation, confirming the performance benefits of sensitivity adaptation. The approach will be most applicable in situations where estimated event rate statistics show low variance and long monitoring durations allow effective sensitivity adaptation.
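The core idea of the abstract — nudging a spotter's sensitivity until its observed detection rate matches an expected event rate — can be sketched as a simple feedback update. This is a minimal illustrative sketch only; the function name, step size, and proportional-style update rule are assumptions, not the paper's actual algorithm.

```python
def adapt_threshold(threshold, observed_events, window_seconds,
                    expected_rate, step=0.05, lower=0.0, upper=1.0):
    """Nudge a spotting threshold so the observed event rate
    approaches the expected event rate (illustrative sketch).

    observed_events: number of events spotted in the last window
    window_seconds:  length of the observation window in seconds
    expected_rate:   expected events per second for this activity
    """
    observed_rate = observed_events / window_seconds
    if observed_rate > expected_rate:
        # Too many detections: raise the threshold (lower sensitivity).
        threshold = min(upper, threshold + step)
    elif observed_rate < expected_rate:
        # Too few detections: lower the threshold (higher sensitivity).
        threshold = max(lower, threshold - step)
    return threshold
```

Run periodically over a monitoring window, such an update is unsupervised: it needs no ground-truth labels, only the expected event rate statistics, which matches the abstract's note that the scheme works best when those statistics have low variance and monitoring durations are long.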
Original language: English
Title of host publication: Proceedings of the IEEE International Conference on Sensor Networks, Ubiquitous, and Trustworthy Computing, 2010, Newport Beach, CA, USA, 7-9 June 2010
Place of publication: Piscataway
Publisher: Institute of Electrical and Electronics Engineers
ISBN (Print): 978-1-4244-7087-7
Publication status: Published - 2010

