Human falls are a common source of injury among the elderly, in part because the injured person is often unable to call for help. The literature addresses this with various fall-detection systems, the most common of which use wearable sensors. This paper describes the winning method developed for the Challenge Up: Multimodal Fall Detection competition. It is a multi-sensor data-fusion machine-learning method that recognizes human activities and falls using 5 wearable inertial sensors: accelerometers and gyroscopes. The method was evaluated on a dataset collected from 12 subjects, 3 of whom were reserved as test data for the challenge. To optimally adapt the method to the 3 test subjects, we performed an unsupervised similarity search that finds the 3 training users most similar to the 3 users in the test data, and tuned the method and its parameters on those most similar users. Internal evaluation on the 9 training users showed that, with this optimized configuration, the method achieves 98% accuracy. In the final evaluation for the challenge, our method achieved the highest results (82.5% F1-score and 98% accuracy) and won the competition.
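The unsupervised similarity search described above could be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each user's sensor data is summarized as a per-window feature matrix, that a user is represented by the mean and standard deviation of those features, and that similarity is measured by Euclidean distance between these signatures (the paper does not specify the exact representation or metric; the function names are hypothetical).

```python
import numpy as np

def user_signature(windows):
    """Summarize a user as a feature signature (assumed representation).

    `windows` is an array of shape (n_windows, n_features), one row of
    extracted features per sliding window of inertial-sensor data.
    """
    return np.concatenate([windows.mean(axis=0), windows.std(axis=0)])

def most_similar_users(train_users, test_users):
    """For each test user, find the nearest training user.

    Both arguments map user IDs to feature matrices; similarity is the
    Euclidean distance between signatures (an illustrative choice).
    """
    matches = {}
    for t_id, t_feats in test_users.items():
        t_sig = user_signature(t_feats)
        best = min(
            train_users,
            key=lambda u: np.linalg.norm(user_signature(train_users[u]) - t_sig),
        )
        matches[t_id] = best
    return matches

# Synthetic example: three training users with distinct feature levels,
# and one test user whose features lie closest to user "B".
train_users = {
    "A": np.zeros((10, 4)),
    "B": np.full((10, 4), 5.0),
    "C": np.full((10, 4), 10.0),
}
test_users = {"T1": np.full((10, 4), 4.8)}
print(most_similar_users(train_users, test_users))  # {'T1': 'B'}
```

Once each test user is paired with its most similar training user, the method's hyperparameters can be tuned against those matched users as a stand-in for the unseen test data.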
CITATION STYLE
Gjoreski, H., Stankoski, S., Kiprijanovska, I., Nikolovska, A., Mladenovska, N., Trajanoska, M., … Gams, M. (2020). Wearable Sensors Data-Fusion and Machine-Learning Method for Fall Detection and Activity Recognition. In Studies in Systems, Decision and Control (Vol. 273, pp. 81–96). Springer. https://doi.org/10.1007/978-3-030-38748-8_4