Telerehabilitation is an alternative to traditional therapeutic rehabilitation that aims to restore human functional movements remotely by using a spectrum of emerging technologies, such as telecommunication technologies and Kinect-based exergame strategies, among others. Its efficacy and clinical usefulness depend on the tools available for monitoring and executing the user's exercise sessions. However, existing telerehabilitation systems are limited to acting as video games, without a quantitative, real-time analysis of rehabilitation progress. In this paper, a novel Kinect-based exercise recognition strategy is presented, which verifies the proper execution of rehabilitation routines based on clinical measures of the body joints in real time. To do so, a skeleton-based human action recognition scheme is introduced using a Kinect sensor. The proposed method is composed of two stages: the first identifies the two defined postures, the initial and final poses, established for an exercise execution; the second compares the trajectories of the body segments involved in the same exercise. The proposed method was validated in two ways, using the posture and the exercise recognition strategies, respectively, on a set of 25 healthy volunteers. The results demonstrate that the joint orientations estimated for each body segment effectively represent clinical measures, improving the efficiency of monitoring the user's performance during the execution of defined exercises while reducing the computational cost required by this kind of strategy. Therefore, the proposed method could be used in clinical applications and in the rehabilitation analysis of impaired persons.
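The clinical measures mentioned in the abstract are joint orientations estimated from the Kinect skeleton. As a minimal illustrative sketch (not the authors' implementation), the angle at a joint can be derived from three 3-D joint positions, and a target posture can then be matched by thresholding the differences against a reference pose; the function names and the tolerance value below are assumptions introduced for illustration.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by the segments b->a and b->c.

    a, b, c are (x, y, z) joint positions, e.g. shoulder, elbow, wrist
    as reported by a Kinect skeleton stream.
    """
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(ba[i] * bc[i] for i in range(3))
    norm_ba = math.sqrt(sum(v * v for v in ba))
    norm_bc = math.sqrt(sum(v * v for v in bc))
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_ang = max(-1.0, min(1.0, dot / (norm_ba * norm_bc)))
    return math.degrees(math.acos(cos_ang))

def pose_matches(observed, reference, tol_deg=10.0):
    """True if every observed joint angle is within tol_deg of the reference pose."""
    return all(abs(o - r) <= tol_deg for o, r in zip(observed, reference))
```

For example, an elbow flexed at a right angle gives `joint_angle((1, 0, 0), (0, 0, 0), (0, 1, 0)) == 90.0`, and an initial or final pose of an exercise could be encoded as a reference vector of such angles and checked with `pose_matches`.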
Velasco, F., & Narváez, F. (2020). Automatic Exercise Recognition Based on Kinect Sensor for Telerehabilitation. In Communications in Computer and Information Science (Vol. 1154 CCIS, pp. 312–324). Springer. https://doi.org/10.1007/978-3-030-46785-2_25