Multimodal activity recognition based on automatic feature discovery

Abstract

In this article, we propose a novel multimodal data analytics scheme for human activity recognition. Traditional data analysis schemes for activity recognition with heterogeneous sensor network setups in eHealth application scenarios usually follow a heuristic process that relies on underlying domain knowledge. Relying on such explicit knowledge is problematic when aiming to create automatic, unsupervised monitoring and tracking of different activities and detection of abnormal events. Experiments on the publicly available OPPORTUNITY activity recognition database from the UCI Machine Learning Repository demonstrate the potential of our approach to support next-generation unsupervised, automatic classification and detection for remote activity recognition in novel eHealth application scenarios, such as monitoring and tracking of the elderly, the disabled, and those with special needs. © 2014 IEEE.
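
The abstract does not spell out the feature-discovery method itself. As a rough illustration only, the sketch below assumes a generic unsupervised pipeline (windowed sensor statistics, PCA-based feature discovery, and k-means grouping of candidate activities) rather than the authors' actual scheme, and it uses synthetic data in place of the OPPORTUNITY recordings.

```python
# Hypothetical sketch: unsupervised feature discovery on windowed multimodal
# sensor data, followed by clustering into candidate activity groups.
# The channel count, window length, and cluster count are illustrative
# assumptions, not parameters taken from the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def sliding_windows(signals, window=30, step=15):
    """Segment a (samples x channels) array into fixed-length windows
    and summarize each window with simple per-channel statistics."""
    feats = []
    for start in range(0, len(signals) - window + 1, step):
        seg = signals[start:start + window]
        feats.append(np.hstack([seg.mean(axis=0), seg.std(axis=0)]))
    return np.array(feats)

# Synthetic stand-in for multimodal sensor streams (e.g., accelerometer + gyro).
rng = np.random.default_rng(0)
signals = rng.normal(size=(3000, 12))   # 3000 samples, 12 sensor channels

X = StandardScaler().fit_transform(sliding_windows(signals))

# "Automatic feature discovery": learn a compact representation without labels.
features = PCA(n_components=8).fit_transform(X)

# Unsupervised grouping of windows into candidate activity clusters.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)
print(labels[:20])
```

In practice, the discovered window-level features could feed either a clustering stage, as above, or a downstream classifier once activity labels are available.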

APA

Chetty, G., White, M., Singh, M., & Mishra, A. (2014). Multimodal activity recognition based on automatic feature discovery. In 2014 International Conference on Computing for Sustainable Global Development, INDIACom 2014 (pp. 632–637). IEEE Computer Society. https://doi.org/10.1109/IndiaCom.2014.6828039
