Abstract
Smartwatches are reaching more people every year and have long since found their way into research. When monitoring the health of people with neurological conditions, it is crucial to detect disease progression as early as possible. Such progression can be detected by monitoring everyday activities, such as eating or drinking, with smartwatches. We propose an approach that combines smartwatches with supervised learning algorithms. A key challenge is that these activities involve similar arm movements, whose movement patterns are therefore hard to distinguish. In this work, we collect sensor data from smartwatches during daily activities and classify these similar activities using acoustic recognition. We sample acoustic sensor data at 20 Hz and send it to a web server. A web application enables a systematic comparison of logistic regression, fast forest, and support vector machine algorithms for classifying four activities with similar motions (eating, drinking, talking on the phone, and blowing one's nose). We classify sensor data from ten subjects performing these similar arm movements. We build on a previous successful attempt to recognize these activities using motion-based sensors. Here, we scrutinize the capabilities of the acoustic sensor and compare its properties and machine learning performance with those of the motion-related sensors.
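As a rough illustration of the classifier comparison the abstract describes, the following is a minimal sketch in Python with scikit-learn. The feature extraction, the 2 s window length, and the use of RandomForestClassifier as a stand-in for fast forest (an ML.NET algorithm) are assumptions for illustration, not the paper's actual implementation.

```python
# Minimal sketch (not the paper's pipeline): compare three classifiers on
# windowed acoustic features. The features, window size, and the random
# forest stand-in for ML.NET's fast forest are all assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def window_features(samples, window=40):
    """Split a 20 Hz acoustic amplitude stream into 2 s windows (40 samples)
    and compute simple per-window statistics as features."""
    n = len(samples) // window
    wins = np.asarray(samples[: n * window]).reshape(n, window)
    return np.column_stack([wins.mean(1), wins.std(1),
                            wins.min(1), wins.max(1)])

# Placeholder streams stand in for real smartwatch microphone data.
rng = np.random.default_rng(0)
X_parts, y_parts = [], []
for label in range(4):  # 0=eating, 1=drinking, 2=phoning, 3=nose blowing
    stream = rng.normal(loc=label, size=20 * 60)  # one minute at 20 Hz
    feats = window_features(stream)
    X_parts.append(feats)
    y_parts.append(np.full(len(feats), label))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest (fast-forest stand-in)": RandomForestClassifier(),
    "support vector machine": SVC(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```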
Citation
Staab, S., Bröning, L., Luderschmidt, J., & Martin, L. (2022). Performance Comparison of Motion-Related Sensor Technology and Acoustic Sensor Technology in the Field of Human Health Monitoring. In ACM International Conference Proceeding Series (pp. 198–204). Association for Computing Machinery. https://doi.org/10.1145/3524458.3547220