Stroke is one of the leading causes of long-term disability worldwide, placing a heavy burden on individuals and society. Automatic human activity recognition, in turn, is a challenging task that is vital to the future of healthcare and physical therapy. This study provides a novel dataset of six motions (stretching, upward stretching, flinging, hand-to-mouth, swiping, and pouring) for improved training and testing of activity-recognition models for stroke-affected patients, and evaluates it with a baseline long short-term memory (LSTM) recurrent neural network. A wearable sensor with a triaxial accelerometer collects real-time data, which are preprocessed before training, and a MATLAB application outputs the prediction results as text and audio. The model is trained on features extracted from the actual patient so that it can recognize new actions, and the recognition accuracy obtained with multiple datasets is compared using the same baseline model. When trained and tested on the new dataset, the baseline model achieves recognition accuracy 11% higher than with the Activities of Daily Living (ADL) dataset, 22% higher than with the Activity Recognition Single Chest-Mounted Accelerometer dataset, and 10% higher than with another real-world dataset.
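The abstract does not give implementation details, so the following MATLAB sketch is only a rough illustration of the kind of baseline described: an LSTM classifier over windows of triaxial accelerometer data with six motion classes. The window length, hidden-unit count, training options, and class labels are assumptions rather than values from the paper, and the training data here are synthetic stand-ins so the sketch runs end to end.

% Hypothetical sketch of a baseline LSTM activity classifier (MATLAB Deep Learning Toolbox).
% All sizes and options are assumptions; they are not specified in the abstract.
numFeatures    = 3;    % triaxial accelerometer: x, y, z
numHiddenUnits = 100;  % assumed LSTM width
numClasses     = 6;    % stretch, upward stretch, fling, hand-to-mouth, swipe, pour
windowLength   = 128;  % assumed samples per activity window

% Synthetic stand-in data; real training would use preprocessed accelerometer
% windows and the corresponding patient activity labels.
numObs     = 120;
classNames = {'stretch','upwardStretch','fling','handToMouth','swipe','pour'};
XTrain = cell(numObs, 1);
for i = 1:numObs
    XTrain{i} = randn(numFeatures, windowLength);  % one [channels x time] window
end
YTrain = categorical(randi(numClasses, numObs, 1), 1:numClasses, classNames);

% Sequence-to-label network: the LSTM summarizes each window into one prediction.
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits, 'OutputMode', 'last')
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'MaxEpochs', 30, ...
    'MiniBatchSize', 32, ...
    'Shuffle', 'every-epoch', ...
    'Verbose', false);

net = trainNetwork(XTrain, YTrain, layers, options);

% Classify a new accelerometer window and report the predicted motion as text;
% in the study, a MATLAB application presents such predictions as text and audio.
XNew  = randn(numFeatures, windowLength);
label = classify(net, XNew);
fprintf('Predicted activity: %s\n', char(label));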
Citation: David, A., Ramadoss, R., Ramachandran, A., & Sivapatham, S. (2023). Activity recognition of stroke-affected people using wearable sensor. ETRI Journal, 45(6), 1079–1089. https://doi.org/10.4218/etrij.2022-0242