HuMAn: Complex activity recognition with multi-modal multi-positional body sensing

Pratool Bharti, Debraj De, Sriram Chellappan, Sajal K. Das

Research output: Contribution to journal › Article › peer-review

81 Scopus citations

Abstract

Current state-of-the-art wearable systems in the literature cannot distinguish a large number of fine-grained and/or complex human activities that appear similar but differ in vital context, such as lying on the floor versus lying on a bed versus lying on a sofa. This paper fills the gap by proposing a novel system, called HuMAn, that recognizes and classifies complex at-home human activities with wearable sensing. Specifically, HuMAn makes such classifications feasible by leveraging selective multi-modal sensor suites from wearable devices, and it enriches the sensed information for activity classification by carefully placing the wearable devices across multiple positions on the human body. The HuMAn system consists of the following components: (a) a practical method for extracting feature sets from selected multi-modal sensor suites; (b) a novel two-level structured classification algorithm that improves accuracy by leveraging sensors at multiple body positions; and (c) refined classification of complex activities with minimal external infrastructure support (e.g., only a few Bluetooth beacons for location context). The proposed system is evaluated with 10 users in real home environments. Experimental results demonstrate that HuMAn can detect 21 complex at-home activities with a high degree of accuracy. Under the same-user evaluation strategy, the average classification accuracy across all 21 activities is as high as 95 percent; under 10-fold cross-validation it is 92 percent, and under leave-one-out cross-validation it is 75 percent.
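To make the two-level idea concrete, the following is a minimal illustrative sketch, not the paper's implementation: level one assigns a coarse posture from motion features of a body-worn accelerometer, and level two refines it into a complex activity using room-level location context from Bluetooth beacons. All thresholds, feature choices, and the activity/zone names here are hypothetical.

```python
# Hypothetical two-level activity classification sketch (assumptions, not
# the HuMAn algorithm). Level 1: coarse posture from accelerometer-magnitude
# features. Level 2: refinement with beacon-derived location context.
from statistics import mean, pstdev

def extract_features(window):
    """Per-window feature set: mean and standard deviation of magnitudes."""
    return (mean(window), pstdev(window))

def level1_posture(features):
    """Coarse posture from motion energy (illustrative thresholds)."""
    _, std = features
    if std < 0.05:
        return "lying"
    elif std < 0.5:
        return "sitting"
    return "moving"

def level2_activity(posture, beacon_zone):
    """Refine a coarse posture using room-level beacon context."""
    refinement = {
        ("lying", "bedroom"): "lying on bed",
        ("lying", "living_room"): "lying on sofa",
        ("lying", "unknown"): "lying on floor",
    }
    return refinement.get((posture, beacon_zone), posture)

# Example: a nearly still accelerometer window sensed in the bedroom.
window = [0.98, 1.01, 1.00, 0.99, 1.00]
posture = level1_posture(extract_features(window))
print(level2_activity(posture, "bedroom"))  # → lying on bed
```

The key design point the sketch mirrors is that ambiguous level-one outputs (here, "lying") become distinguishable complex activities only once location context is fused in at level two.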

Original language: English
Article number: 8374816
Pages (from-to): 857-870
Number of pages: 14
Journal: IEEE Transactions on Mobile Computing
Volume: 18
Issue number: 4
DOIs
State: Published - Apr 1 2019
Externally published: Yes

Funding

The authors thank the anonymous reviewers and the Associate Editor for their constructive feedback and valuable suggestions to help improve the quality of the manuscript. This work was supported in part by the US National Science Foundation (NSF) under Award Nos. IIS-1404673, IIP-1648907, IIP-1540119, IIS-1254117 and II-New-1205695. D. De was a postdoctoral research fellow in computer science at the Missouri University of Science and Technology while this work was done.

Funders

• US National Science Foundation
• National Science Foundation

Keywords

• Complex activity recognition
• conditional random fields
• smart health
• smartphone multi-modal sensors
