In this study, human-robot collaboration (HRC) via the force myography (FMG) bio-signal was investigated. Interactive hand force was estimated while moving a wooden rod in 3D with a Kuka robot. A baseline FMG-based deep convolutional neural network (FMG-DCNN) model could only moderately estimate the applied forces during the HRC task. Model performance could be improved with additional training data; however, collecting such data was impractical and time-consuming. Multi-source data (32 feature spaces), collected over a long period during human-robot interaction (HRI) with a linear robot, might instead be useful. Therefore, we explored a cross-domain generalization (CDG) technique that, for the first time, allowed pretraining a model to transfer knowledge between unrelated source (2D-HRI) and target (3D-HRC) data. An FMG-based transfer learning with CDG (TL-CDG) model trained on these multiple source domains was examined in estimating applied forces from 16-channel FMG data during interactions with the Kuka robot. Two target scenarios were evaluated: case i) a collaborative task of moving the wooden rod in 3D, and case ii) grasping interactions in 1D. In both cases, a small set of calibration data fine-tuned the TL-CDG model and improved recognition of out-of-domain target data (case i: R² ≈ 60–63%; case ii: R² ≈ 79–87%) compared to the baseline FMG-DCNN model. Hence, cross-domain generalization could be useful in platform-independent FMG-based HRI applications.
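As a rough illustration of the transfer-learning-with-CDG recipe summarized above, the following PyTorch sketch pretrains a small 1-D convolutional force regressor on placeholder 32-feature source-domain frames, then fine-tunes it with a handful of placeholder 16-channel target-domain calibration frames. The architecture, layer sizes, training settings, and synthetic tensors are illustrative assumptions only and do not reproduce the authors' FMG-DCNN or TL-CDG models.

# Minimal sketch of the TL-CDG idea (assumptions throughout): pretrain on a
# 32-feature source domain, then fine-tune on a few 16-channel target frames.
import torch
import torch.nn as nn

class FMGRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(4),  # fixed-size output for any channel count
            nn.Flatten(),
        )
        self.head = nn.Linear(32 * 4, 1)  # scalar interactive-force estimate

    def forward(self, x):                 # x: (batch, 1, n_fmg_channels)
        return self.head(self.features(x))

def train(model, x, y, lr, epochs):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

# 1) Pretrain on abundant (here: synthetic) 32-feature source-domain data.
source_model = FMGRegressor()
train(source_model, torch.randn(256, 1, 32), torch.randn(256, 1), 1e-3, 50)

# 2) Transfer the pretrained weights and fine-tune on a small calibration
#    set of 16-channel target-domain frames (the "few calibration data").
target_model = FMGRegressor()
target_model.load_state_dict(source_model.state_dict())
train(target_model, torch.randn(32, 1, 16), torch.randn(32, 1), 1e-4, 20)

In this sketch, the adaptive pooling keeps the feature extractor agnostic to the number of FMG channels, which is what lets the same pretrained weights serve both the 32-feature source domain and the 16-channel target domain; this is a design choice made here for illustration, not a claim about the paper's network.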
Zakia, U., & Menon, C. (2022). Human-Robot Collaboration in 3D via Force Myography Based Interactive Force Estimations Using Cross-Domain Generalization. IEEE Access, 10, 35835–35845. https://doi.org/10.1109/ACCESS.2022.3164103