In this article, we propose a novel multimodal data analytics scheme for human activity recognition. Traditional data analysis schemes for activity recognition using heterogeneous sensor network setups in eHealth application scenarios are usually heuristic processes that rely on underlying domain knowledge. Relying on such explicit knowledge is problematic when aiming to create automatic, unsupervised or semi-supervised monitoring and tracking of different activities, and detection of abnormal events. Experiments on the publicly available OPPORTUNITY activity recognition database from the UCI Machine Learning Repository demonstrate the potential of our approach to address next-generation unsupervised automatic classification and detection approaches for remote activity recognition in novel eHealth application scenarios, such as monitoring and tracking of the elderly, the disabled, and those with special needs.
CITATION STYLE
Chetty, G., & Yamin, M. (2014). A novel multimodal data analytic scheme for human activity recognition. In IFIP Advances in Information and Communication Technology (Vol. 426, pp. 449–458). Springer New York LLC. https://doi.org/10.1007/978-3-642-55355-4_47