Maneuvers Under Estimation of Human Postures for Autonomous Navigation of Robot KUKA YouBot

Abstract

We present a successful demonstration of autonomous navigation based on maneuvers under certain human postures for an omnidirectional KUKA YouBot robot. The integration of human posture detection and navigation capabilities in the robot was accomplished through the Robot Operating System (ROS) and the open-source computer vision library OpenCV. ROS allows algorithms to be implemented on real-time and simulated platforms, while OpenCV enables the recognition of human posture signals through the Faster R-CNN (regions with convolutional neural networks) deep-learning approach, which for its application in OpenCV is translated to SURF (speeded-up robust features), one of the most widely used algorithms for extracting points of interest in image recognition. The main contribution of this work is showing that the estimation of human postures is a promising method for providing intelligence in the autonomous navigation of the KUKA YouBot robot, since the robot learns from human postures and is capable of performing a desired task during navigation or any other activity.
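The abstract describes mapping recognized human postures to navigation maneuvers of an omnidirectional (mecanum-wheel) base. A minimal sketch of that idea is shown below; it is not the authors' code. The posture labels, the twist values assigned to them, and the base geometry constants are illustrative assumptions, though the inverse-kinematics formula itself is the standard one for mecanum platforms such as the YouBot.

```python
# Sketch (illustrative, not the paper's implementation): map a detected
# human-posture label to a body twist (vx, vy, wz) and convert it to the
# four mecanum wheel speeds of an omnidirectional base like the KUKA YouBot.

from typing import Dict, Tuple

# Hypothetical posture -> (vx [m/s], vy [m/s], wz [rad/s]) commands.
POSTURE_COMMANDS: Dict[str, Tuple[float, float, float]] = {
    "arms_up": (0.0, 0.0, 0.0),         # stop
    "arms_forward": (0.2, 0.0, 0.0),    # move forward
    "left_arm_out": (0.0, 0.2, 0.0),    # strafe left
    "right_arm_out": (0.0, -0.2, 0.0),  # strafe right
}

def mecanum_wheel_speeds(vx: float, vy: float, wz: float,
                         lx: float = 0.228, ly: float = 0.158,
                         r: float = 0.0475) -> Tuple[float, float, float, float]:
    """Standard mecanum inverse kinematics: body twist -> wheel angular
    velocities (front-left, front-right, rear-left, rear-right).
    lx/ly (half wheelbase/track) and wheel radius r are approximate
    YouBot-like values used only for illustration."""
    k = lx + ly
    fl = (vx - vy - k * wz) / r
    fr = (vx + vy + k * wz) / r
    rl = (vx + vy - k * wz) / r
    rr = (vx - vy + k * wz) / r
    return fl, fr, rl, rr

def command_for_posture(label: str) -> Tuple[float, float, float, float]:
    """Look up the twist for a detected posture (stop if unknown) and
    return the corresponding four wheel speeds."""
    vx, vy, wz = POSTURE_COMMANDS.get(label, (0.0, 0.0, 0.0))
    return mecanum_wheel_speeds(vx, vy, wz)
```

In a ROS-based setup like the one described, the twist would typically be published as a velocity command and the posture label would come from the vision pipeline; here the lookup is kept self-contained for clarity.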

CITATION STYLE

APA

Gordón, C., Barahona, S., Cumbajín, M., & Encalada, P. (2021). Maneuvers Under Estimation of Human Postures for Autonomous Navigation of Robot KUKA YouBot. In Advances in Intelligent Systems and Computing (Vol. 1250 AISC, pp. 335–345). Springer. https://doi.org/10.1007/978-3-030-55180-3_25
