Visual-Guided Robotic Object Grasping Using Dual Neural Network Controllers

22 citations · 20 Mendeley readers

Abstract

Accurately reaching for and grasping objects remains a challenging task for robotic arms and has drawn considerable research attention. This article proposes a robotic hand-eye coordination system that mimics the human behavior pattern to achieve fast and robust reaching. The system comprises two neural-network-based controllers: a rough reaching-movement controller implemented with a pretrained radial basis function network, and a correction-movement controller built from a specifically designed brain emotional nesting network (BENN) for smooth corrective movements. In particular, the proposed BENN offers high nonlinear mapping ability, with its adaptive laws derived from the Lyapunov stability theorem; robust tracking performance, and hence the stability of the proposed control system, is guaranteed through the $H^{\infty}$ control approach. The proposed BENN is validated on a chaos synchronization simulation, and the overall control system is evaluated on object-grasping tasks using a physical robotic arm in a real-world environment. The experimental results demonstrate the superiority of the proposed control system over counterparts using single neural networks.
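To give a flavor of the rough-reaching stage described above, the sketch below shows a minimal radial basis function (RBF) network forward pass mapping an input (e.g., a visually observed target position) to an output (e.g., a rough joint command). All centers, widths, and weights here are hypothetical placeholders for illustration, not the paper's trained parameters or architecture details.

```python
import math

def rbf_forward(x, centers, width, weights):
    """Minimal RBF network readout (illustrative only).

    x       : input feature vector (e.g., target position)
    centers : list of Gaussian basis-function centers, one per hidden unit
    width   : shared Gaussian width (sigma)
    weights : output weights; weights[j][k] maps hidden unit j to output k
    """
    # Gaussian activation of each hidden unit
    acts = [
        math.exp(-sum((xi - ci) ** 2 for xi, ci in zip(x, c)) / (2 * width ** 2))
        for c in centers
    ]
    n_out = len(weights[0])
    # Linear readout: each output is a weighted sum of activations
    return [
        sum(acts[j] * weights[j][k] for j in range(len(centers)))
        for k in range(n_out)
    ]

# Hypothetical example: two hidden units over a 2-D input, one output.
centers = [[0.0, 0.0], [1.0, 1.0]]
weights = [[0.5], [1.5]]
out = rbf_forward([0.0, 0.0], centers, width=1.0, weights=weights)
```

In the paper's scheme such a pretrained mapping only drives the coarse approach; the BENN-based controller then supplies the smooth corrective movements near the target.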

Cite

CITATION STYLE

APA

Fang, W., Chao, F., Lin, C. M., Zhou, D., Yang, L., Chang, X., … Shang, C. (2021). Visual-Guided Robotic Object Grasping Using Dual Neural Network Controllers. IEEE Transactions on Industrial Informatics, 17(3), 2282–2291. https://doi.org/10.1109/TII.2020.2995142
