aHead: Considering the head position in a multi-sensory setup of wearables to recognize everyday activities with intelligent sensor fusions


Abstract

In this paper we examine the feasibility of Human Activity Recognition (HAR) based on head-mounted sensors, both as stand-alone sensors and as part of a wearable multi-sensory network. To prove the feasibility of such a setting, an interactive online HAR system has been implemented that enables multi-sensory activity recognition using hierarchical sensor fusion. Our system incorporates three sensor positions distributed over the body: head (smart glasses), wrist (smartwatch), and hip (smartphone). We are able to reliably distinguish seven daily activities: resting, being active, walking, running, jumping, cycling, and office work. The results of our field study with 14 participants clearly indicate that the head position is applicable for HAR. Moreover, we demonstrate an intelligent multi-sensory fusion concept that increases the recognition performance up to 86.13% (recall). Furthermore, we found the head to possess very distinctive movement patterns regarding activities of daily living.
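The abstract describes fusing predictions from three body-worn sensor positions (head, wrist, hip). As a rough illustration of how such a multi-sensory fusion could combine per-sensor classifier outputs, here is a minimal weighted soft-voting sketch; the sensor names, weights, probability values, and function are hypothetical and not taken from the paper, which uses its own hierarchical fusion concept.

```python
import numpy as np

# The seven activity classes named in the abstract.
ACTIVITIES = ["resting", "being active", "walking", "running",
              "jumping", "cycling", "office work"]

def fuse_predictions(per_sensor_probs, weights=None):
    """Fuse per-sensor class-probability vectors by weighted soft voting.

    per_sensor_probs: dict mapping sensor name -> probability vector
                      over ACTIVITIES (each assumed to sum to 1).
    weights: optional dict of per-sensor weights; defaults to equal.
    Returns the fused probability vector and the winning activity label.
    """
    if weights is None:
        weights = {s: 1.0 for s in per_sensor_probs}
    total = sum(weights[s] for s in per_sensor_probs)
    fused = np.zeros(len(ACTIVITIES))
    for sensor, probs in per_sensor_probs.items():
        fused += (weights[sensor] / total) * np.asarray(probs)
    return fused, ACTIVITIES[int(np.argmax(fused))]

# Example: head and wrist classifiers favor "walking", hip is uncertain.
probs = {
    "head":  [0.05, 0.05, 0.60, 0.10, 0.05, 0.10, 0.05],
    "wrist": [0.10, 0.10, 0.50, 0.10, 0.05, 0.10, 0.05],
    "hip":   [0.20, 0.20, 0.20, 0.10, 0.10, 0.10, 0.10],
}
fused, label = fuse_predictions(probs)  # label -> "walking"
```

A hierarchical variant, as hinted at in the abstract, might instead let a confident head-mounted classifier decide alone and fall back to the combined vote otherwise; the weighting used here is simply one plausible baseline.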

Citation (APA)

Haescher, M., Trimpop, J., Matthies, D. J. C., Bieber, G., Urban, B., & Kirste, T. (2015). aHead: Considering the head position in a multi-sensory setup of wearables to recognize everyday activities with intelligent sensor fusions. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9170, pp. 741–752). Springer Verlag. https://doi.org/10.1007/978-3-319-20916-6_68
