This paper addresses the challenges in existing upper limb robotic rehabilitation training for asymmetric bimanual activities of daily living (ADL), which are crucial for stroke patients' ADL-related functional recovery. Our proposed exoskeleton control framework introduces independent joint control and visual guidance in virtual reality (VR) to facilitate asymmetric bimanual ADL training. Unlike conventional task-space control methods, our approach implements independent joint control in underactuated exoskeletons, offering individualized assistance tailored to each patient's joint impairment. The framework uses human-demonstrated ADL motions for joint trajectory planning, so therapist demonstrations can potentially teach compensatory techniques with unique joint coordination patterns. To address inter-joint coordination challenges in underactuated exoskeletons, VR visual guidance helps patients self-coordinate the unassisted and assisted joints. The proposed framework was evaluated in human experiments with 15 healthy subjects. The results demonstrated the effectiveness of the visual guidance, showing a motion period similar to the human-demonstrated motion and statistically significant reductions in angle errors at the shoulder (sF/E-A) and wrist (wF/E-A). The robot assistance provided by the control framework was further validated by statistically significant reductions in electromyography (EMG) activity and angle errors at the robot-assisted joints. This proof of concept with healthy subjects suggests the potential of our control framework to assist stroke patients in asymmetric bimanual ADL training.
Citation:
Kwok, T. M., & Yu, H. (2024). Asymmetric Bimanual ADL Training With Underactuated Exoskeleton Using Independent Joint Control and Visual Guidance. IEEE Access, 12, 9277–9291. https://doi.org/10.1109/ACCESS.2024.3352911