Multimodal Wearable Sensing for Sport-Related Activity Recognition Using Deep Learning Networks

Abstract

Sensor-based Human Activity Recognition (S-HAR) with wearable sensors can generally recognize simple, routine actions (walking, sitting, or standing), but struggles to distinguish sophisticated activities such as sports-related movements. Because such activities require a more comprehensive, contextual, and fine-grained classification of complex human motion, simple activity recognition systems are inadequate for emerging real-world applications such as remote rehabilitation monitoring and sports performance tracking. This study therefore proposes an S-HAR framework for recognizing sport-related activities using multimodal wearable sensors placed at multiple body positions. The recognition performance of five deep learning networks was investigated on the public UCI-DSADS dataset. According to the experimental results, the BiGRU recognition model surpasses the other deep learning networks, achieving a maximum accuracy of 99.62%.
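The article itself does not include code, but the BiGRU approach it evaluates can be illustrated with a minimal sketch. The PyTorch example below assumes windowed multimodal input in the shape typically used with UCI-DSADS (45 channels from five body-worn units, 125-timestep windows of 5 s at 25 Hz, 19 activity classes); the hidden size, layer count, and classifier head are illustrative choices, not the authors' reported configuration.

```python
import torch
import torch.nn as nn


class BiGRUClassifier(nn.Module):
    """Bidirectional GRU over fixed-length windows of wearable-sensor channels."""

    def __init__(self, num_channels=45, hidden_size=64, num_classes=19):
        super().__init__()
        # Two stacked bidirectional GRU layers read each window timestep by timestep.
        self.bigru = nn.GRU(
            input_size=num_channels,
            hidden_size=hidden_size,
            num_layers=2,
            batch_first=True,
            bidirectional=True,
        )
        # Concatenated forward/backward states feed a small classification head.
        self.head = nn.Sequential(
            nn.Linear(2 * hidden_size, 128),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(128, num_classes),
        )

    def forward(self, x):
        # x: (batch, window_length, num_channels), e.g. 125 timesteps of 45 channels.
        outputs, _ = self.bigru(x)
        # Classify from the last timestep's representation (both directions).
        return self.head(outputs[:, -1, :])


if __name__ == "__main__":
    # One batch of 8 five-second windows sampled at 25 Hz (125 timesteps).
    windows = torch.randn(8, 125, 45)
    logits = BiGRUClassifier()(windows)
    print(logits.shape)  # torch.Size([8, 19])
```

In practice, a model like this would be trained with cross-entropy loss on segmented, normalized sensor windows; the choice of window length and sensor placement follows the dataset's recording protocol rather than anything specific to this sketch.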

Citation (APA)

Mekruksavanich, S., & Jitpattanakul, A. (2022). Multimodal Wearable Sensing for Sport-Related Activity Recognition Using Deep Learning Networks. Journal of Advances in Information Technology, 13(2), 132–138. https://doi.org/10.12720/jait.13.2.132-138
