Robot motion and grasping for blindfold handover

Abstract

Autonomous robots for human-robot interaction (HRI) are becoming part of everyday life as service and personal robots are increasingly used in the home. To help close this gap in HRI, we propose a system for autonomous robot motion and grasp generation that assists visually impaired people with handover, pick, and place tasks. In this paper, we develop robot motions for receiving an object handed over by a blindfolded human, who stands in for a blind user. To locate the object and the human hand, we implement 6-DOF pose estimation from a point cloud and hand detection with a Single Shot Detector (SSD) deep learning model, and use these targets to plan motions for a 9-DOF robot arm with a hand. Finally, we experimentally evaluate the system on blindfolded human-robot handover tasks.
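
The abstract outlines a perception-to-planning pipeline: detect the human hand, estimate a 6-DOF object pose from a point cloud, then plan the arm motion. Below is a minimal Python sketch of that pipeline, not the authors' implementation; the function names are hypothetical placeholders, the SSD detector is stubbed out, and the pose estimate uses a common coarse method (centroid plus PCA axes) purely for illustration.

# Minimal sketch of the handover pipeline described in the abstract.
# Assumptions: detect_hand_bbox stands in for the SSD hand detector,
# estimate_6dof_pose gives a coarse pose (centroid + principal axes),
# and plan_handover stands in for the 9-DOF arm/hand motion planner.
import numpy as np

def detect_hand_bbox(rgb_image: np.ndarray):
    """Placeholder for the SSD hand detector; returns a dummy (x, y, w, h) box."""
    h, w = rgb_image.shape[:2]
    return (w // 2 - 20, h // 2 - 20, 40, 40)

def estimate_6dof_pose(points: np.ndarray) -> np.ndarray:
    """Coarse 6-DOF pose of an object point cloud: centroid as translation,
    PCA eigenvectors as a rotation frame."""
    centroid = points.mean(axis=0)
    centered = points - centroid
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    rotation = vt.T                       # columns = principal axes
    if np.linalg.det(rotation) < 0:       # keep a right-handed frame
        rotation[:, -1] *= -1
    pose = np.eye(4)
    pose[:3, :3] = rotation
    pose[:3, 3] = centroid
    return pose

def plan_handover(hand_bbox, object_pose):
    """Placeholder for planning the 9-DOF arm+hand motion toward the targets."""
    print("hand bbox:", hand_bbox)
    print("object pose:\n", np.round(object_pose, 3))

if __name__ == "__main__":
    rgb = np.zeros((480, 640, 3), dtype=np.uint8)                # dummy camera frame
    cloud = np.random.rand(500, 3) + np.array([0.4, 0.0, 0.2])   # dummy object cloud
    plan_handover(detect_hand_bbox(rgb), estimate_6dof_pose(cloud))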

Citation (APA)
Chumkamon, S., Kawamoto, K., Inthiam, J., Yokkampon, U., & Hayashi, E. (2020). Robot motion and grasping for blindfold handover. In Proceedings of International Conference on Artificial Life and Robotics (Vol. 2020, pp. 255–258). ALife Robotics Corporation Ltd. https://doi.org/10.5954/ICAROB.2020.OS24-4
