In this article, we propose a novel multimodal data analytics scheme for human activity recognition. Traditional data analysis schemes for activity recognition using heterogeneous sensor network setups in e-Health application scenarios are usually heuristic, relying on underlying domain knowledge. Depending on such explicit knowledge is problematic when the aim is to create automatic, unsupervised monitoring and tracking of different activities and detection of abnormal events. Experiments on the publicly available OPPORTUNITY activity recognition dataset from the UCI machine learning repository demonstrate the potential of our approach to support next-generation unsupervised, automatic classification and detection approaches for remote activity recognition in novel eHealth application scenarios, such as monitoring and tracking of the elderly, the disabled, and those with special needs. © 2014 IEEE.
CITATION STYLE
Chetty, G., White, M., Singh, M., & Mishra, A. (2014). Multimodal activity recognition based on automatic feature discovery. In 2014 International Conference on Computing for Sustainable Global Development, INDIACom 2014 (pp. 632–637). IEEE Computer Society. https://doi.org/10.1109/IndiaCom.2014.6828039