Activity Recognition (AR) from smartphone sensors has become an active topic in the mobile computing domain, since it can provide services directly to the user (health monitoring, fitness, context-awareness) as well as to third-party applications and social networks (performance sharing, profiling). Most research effort has focused on recognition from accelerometer sensors, and few studies have integrated the audio channel into their models, despite it being a sensor that is available on all kinds of smartphones. In this study, we show that audio features bring an important performance improvement over an accelerometer-based approach. Moreover, the study demonstrates the interest of considering the smartphone's on-body location for online context-aware AR, and the predictive power of audio features for this task. Finally, another contribution of the study is the collected corpus, which is made available to the community for AR from audio and accelerometer sensors.
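To illustrate the kind of multimodal pipeline the abstract describes, below is a minimal sketch of early (feature-level) fusion of accelerometer and audio features. The specific features (magnitude mean/deviation, RMS energy, zero-crossing rate) and the fusion strategy are illustrative assumptions, not the paper's actual feature set or classifier.

```python
import math
import statistics

def accel_features(ax, ay, az):
    """Time-domain features from a window of tri-axial accelerometer samples.

    Illustrative choice: mean and standard deviation of the magnitude signal,
    which are common baseline features in accelerometer-based AR.
    """
    mag = [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(ax, ay, az)]
    return [statistics.mean(mag), statistics.pstdev(mag)]

def audio_features(samples):
    """Frame-level audio features: RMS energy and zero-crossing rate.

    Both are cheap descriptors often used to characterise ambient sound;
    the paper may use a different (richer) audio feature set.
    """
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    zcr = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    return [rms, zcr / (len(samples) - 1)]

def fused_vector(ax, ay, az, audio):
    # Early fusion: concatenate per-modality features into one vector,
    # which a downstream classifier would consume per time window.
    return accel_features(ax, ay, az) + audio_features(audio)

# Toy window of sensor data (hypothetical values, for illustration only)
acc_x = [0.1, 0.2, 0.1, 0.0]
acc_y = [9.7, 9.8, 9.8, 9.9]
acc_z = [0.3, 0.2, 0.4, 0.3]
mic = [0.02, -0.01, 0.03, -0.02, 0.01]

features = fused_vector(acc_x, acc_y, acc_z, mic)
```

Any classifier trained on such windows could then compare accelerometer-only vectors against fused ones to quantify the improvement the abstract reports.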
CITATION STYLE
Blachon, D., Coşkun, D., & Portet, F. (2014). On-line context aware physical activity recognition from the accelerometer and audio sensors of smartphones. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 8850, 205–220. https://doi.org/10.1007/978-3-319-14112-1_17