Robust Kalman filtering cooperated Elman neural network learning for vision-sensing-based robotic manipulation with global stability


Abstract

In this paper, a global-state-space visual servoing scheme is proposed for uncalibrated, model-independent robotic manipulation. The scheme combines robust Kalman filtering (KF) with Elman neural network (ENN) learning. The global mapping between the vision space and the robotic workspace is learned by an ENN, and this learned mapping is shown to be an approximate estimate of the Jacobian over the global space. In the testing phase, a robust KF refines the ENN's output to obtain the desired Jacobian, so that the robot converges precisely to the desired pose. Meanwhile, the ENN weights are updated (re-trained) with new input-output data pairs obtained from the KF cycle, ensuring globally stable manipulation. The method therefore requires neither camera nor model parameters, avoiding the performance degradation caused by camera-calibration and modeling errors. The proposed scheme's performance is demonstrated through simulations and experiments on a six-degree-of-freedom robotic manipulator in an eye-in-hand configuration. © 2013 by the authors; licensee MDPI, Basel, Switzerland.
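The core idea the abstract describes, a learned Jacobian prior refined online by a Kalman filter from observed joint/feature displacement pairs, can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact algorithm: the state is the vectorized Jacobian, the measurement model follows from Δs = J Δq, and all dimensions, noise variances, and the perturbed initial guess (standing in for the ENN's prediction) are hypothetical.

```python
import numpy as np

def kf_jacobian_update(J, P, delta_q, delta_s, r_var=1e-6, q_var=1e-8):
    """One KF predict/correct cycle on the vectorized Jacobian estimate.

    J       : (m, n) current Jacobian estimate (e.g. an ENN prediction)
    P       : (m*n, m*n) covariance of vec(J)
    delta_q : (n,) joint displacement since the last frame
    delta_s : (m,) observed image-feature displacement
    """
    m, n = J.shape
    x = J.reshape(-1)                       # row-major vec(J)
    H = np.kron(np.eye(m), delta_q)         # chosen so that H @ x == J @ delta_q
    P = P + q_var * np.eye(m * n)           # predict: random-walk Jacobian model
    S = H @ P @ H.T + r_var * np.eye(m)     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ (delta_s - H @ x)           # correct with the measurement innovation
    P = (np.eye(m * n) - K @ H) @ P
    return x.reshape(m, n), P

# Demo: recover a 2x3 Jacobian from displacement pairs, starting from a
# deliberately perturbed initial estimate.
rng = np.random.default_rng(0)
J_true = rng.normal(size=(2, 3))
J_est = J_true + 0.5 * rng.normal(size=(2, 3))
P = np.eye(6)
for _ in range(200):
    dq = 0.1 * rng.normal(size=3)           # small joint displacement
    ds = J_true @ dq                        # observed feature motion
    J_est, P = kf_jacobian_update(J_est, P, dq, ds)
```

In the paper's scheme the corrected Jacobian additionally feeds back into ENN re-training; here the loop only shows the filtering half, with the estimate converging toward the true mapping as displacement pairs accumulate.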

Citation (APA)

Zhong, X., Zhong, X., & Peng, X. (2013). Robust Kalman filtering cooperated Elman neural network learning for vision-sensing-based robotic manipulation with global stability. Sensors (Switzerland), 13(10), 13464–13486. https://doi.org/10.3390/s131013464
