Smart manipulation approach for assistant robot

Abstract

This work deals with a smart, vision-assisted manipulation algorithm for an assistant robot. The algorithm is divided into two parts: object visual position feedback and whole-body motion computation. The object position feedback is based on image analysis to define coordinates in R3 space and provide the robot with a reference for the object's localization. The whole-body motion computation handles hand motion planning and body motion to improve the hand's reach and broaden the arm workspace, so that the robot can grasp the object, pick it up, and bring it to a goal position. An image analysis method is proposed using triangle similarity, with the object's geometric conditions and the camera-to-object distance known in advance. Novel motion-modified D-H parameters are used to build the workspace. Simulation and experimental results are discussed in order to validate our proposal.
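The two techniques named in the abstract can be illustrated with short sketches. These are assumptions for illustration only, not the paper's implementation: the pinhole-camera model, the focal length and principal point, and every function name and value below are hypothetical.

A minimal sketch of triangle-similarity depth estimation and back-projection to R3, assuming a known object width and a calibrated focal length:

```python
# Illustrative sketch only: pinhole-camera, triangle-similarity depth
# estimate and back-projection to R^3. Focal length, principal point and
# the object's real width are assumed known; none of the names or values
# come from the paper.

def estimate_distance(real_width_m, focal_length_px, pixel_width_px):
    """Similar triangles: Z = (W * f) / w, where W is the object's real
    width, f the focal length in pixels and w its width in the image."""
    return (real_width_m * focal_length_px) / pixel_width_px


def pixel_to_camera_xyz(u, v, cx, cy, focal_length_px, z_m):
    """Back-project an image point (u, v) to camera-frame coordinates
    (X, Y, Z) using the estimated depth z_m and principal point (cx, cy)."""
    x = (u - cx) * z_m / focal_length_px
    y = (v - cy) * z_m / focal_length_px
    return x, y, z_m


# Example: a 6 cm-wide object seen 80 px wide with a 600 px focal length.
z = estimate_distance(0.06, 600.0, 80.0)            # -> 0.45 m
obj_xyz = pixel_to_camera_xyz(360, 250, 320, 240, 600.0, z)
```

A minimal sketch of forward kinematics with modified D-H parameters (the Craig convention is assumed here); the parameter table would come from the robot's actual link geometry, and sampling joint angles through the chain approximates the arm workspace:

```python
import numpy as np

def mdh_transform(alpha, a, d, theta):
    """Link transform built from modified D-H parameters (Craig convention):
    rotation alpha and offset a about x_{i-1}, then d and theta about z_i."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct,      -st,      0.0,  a],
        [st * ca,  ct * ca, -sa, -sa * d],
        [st * sa,  ct * sa,  ca,  ca * d],
        [0.0,      0.0,      0.0, 1.0],
    ])

def forward_kinematics(mdh_table, joint_angles):
    """Chain the link transforms to obtain the hand pose in the base frame.
    mdh_table rows are (alpha, a, d, theta_offset); values are placeholders."""
    T = np.eye(4)
    for (alpha, a, d, theta0), q in zip(mdh_table, joint_angles):
        T = T @ mdh_transform(alpha, a, d, theta0 + q)
    return T

# Sampling joint angles through forward_kinematics and collecting the
# resulting hand positions gives a point-cloud approximation of the
# reachable workspace.
```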

Citation (APA)

Becerra, Y., Leon, J., Orjuela, S., Arbulu, M., Matinez, F., & Martinez, F. (2020). Smart manipulation approach for assistant robot. In Lecture Notes in Electrical Engineering (Vol. 554, pp. 904–913). Springer Verlag. https://doi.org/10.1007/978-3-030-14907-9_87
