Abstract
In this paper, we propose a biologically inspired framework for robot learning from demonstrations. The dynamic movement primitive (DMP), motivated by neurobiology and human behavior, is employed to model generalizable robotic motions. However, the standard DMP can encode only a single demonstration. To enable the robot to learn from multiple demonstrations, the DMP is combined with the Gaussian mixture model (GMM) to integrate the features of several demonstrations, and the conventional GMM is further replaced by the fuzzy GMM (FGMM) to improve fitting performance. A novel regression algorithm for the FGMM is also derived to retrieve the nonlinear forcing term of the DMP. Additionally, a neural-network-based controller is developed for the robot to track the generated motions; in this network, the cerebellar model articulation controller (CMAC) compensates for the unknown robot dynamics. Experiments performed on a Baxter robot demonstrate the effectiveness of the proposed methods.
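To make the motion-modeling idea concrete, the following is a minimal sketch of a one-dimensional discrete DMP: a critically damped spring-damper system driven toward a goal, modulated by a phase-dependent nonlinear forcing term. The gains (K, alpha), the basis-function forcing term, and all parameter values here are illustrative assumptions, not the paper's exact formulation; in the paper the forcing term is retrieved by FGMM regression over multiple demonstrations rather than by fixed basis weights.

```python
import numpy as np

def dmp_rollout(x0, g, weights, centers, widths,
                tau=1.0, dt=0.001, K=100.0, alpha=4.0):
    """Integrate a 1-D DMP from start x0 to goal g (Euler integration).

    The forcing term f(s) is a normalized radial-basis mixture scaled by
    the canonical phase s, so its influence vanishes as s decays to 0 and
    the system converges to the goal. All gains are illustrative choices.
    """
    D = 2.0 * np.sqrt(K)           # critical damping for the spring term
    x, v, s = x0, 0.0, 1.0         # position, velocity, canonical phase
    traj = [x]
    for _ in range(int(tau / dt)):
        psi = np.exp(-widths * (s - centers) ** 2)       # RBF activations
        f = s * (g - x0) * np.dot(psi, weights) / (psi.sum() + 1e-10)
        a = (K * (g - x) - D * v + f) / tau              # transformation system
        v += a * dt
        x += v * dt / tau
        s += (-alpha * s / tau) * dt                     # canonical system decay
        traj.append(x)
    return np.array(traj)

# With zero forcing weights the DMP behaves as a pure spring-damper
# and converges smoothly to the goal.
path = dmp_rollout(x0=0.0, g=1.0, weights=np.zeros(10),
                   centers=np.linspace(0.0, 1.0, 10),
                   widths=np.full(10, 25.0))
```

Learning a demonstration then amounts to fitting the forcing term so that the rollout reproduces the demonstrated trajectory; generalization to new start and goal positions comes from the (g - x0) scaling and the goal-directed spring dynamics.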
Yang, C., Chen, C., Wang, N., Ju, Z., Fu, J., & Wang, M. (2019). Biologically Inspired Motion Modeling and Neural Control for Robot Learning from Demonstrations. IEEE Transactions on Cognitive and Developmental Systems, 11(2), 281–291. https://doi.org/10.1109/TCDS.2018.2866477