Abstract
This article presents a work in assistive robotics, in which a scenario is established for a robot to deliver a tool into the hand of a user after the user has verbally requested it by name. For this, three convolutional neural networks are trained: one for recognizing a group of tools (scalpel, screwdriver and scissors), which obtained 98% accuracy in identifying the tools established for the application; one for speech recognition, trained with the names of the tools in Spanish, which reached a validation accuracy of 97.5% in recognizing the words; and another for recognizing the user's hand, classifying two gestures, open and closed hand, which achieved 96.25% accuracy. With these networks, real-time tests were performed, and the delivery of each tool was completed with 100% accuracy, i.e. the robot correctly identified what the user requested, correctly recognized each tool, and delivered the one needed when the user opened their hand, taking an average of 45 seconds to execute the application.
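The abstract does not specify the network architectures used. As an illustration only, a minimal sketch of a three-class tool classifier of the kind described, assuming TensorFlow/Keras and 128x128 RGB input images (both hypothetical choices, not details from the paper), might look like the following:

```python
# Illustrative sketch only: the paper's actual architectures are not given in the abstract.
# Assumes TensorFlow/Keras and 128x128 RGB inputs (hypothetical choices).
import tensorflow as tf
from tensorflow.keras import layers, models

TOOL_CLASSES = ["scalpel", "screwdriver", "scissors"]  # classes named in the abstract

def build_tool_classifier(input_shape=(128, 128, 3), num_classes=len(TOOL_CLASSES)):
    """Small CNN that maps a tool image to one of the three tool classes."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_tool_classifier()
model.summary()
```

The speech-recognition and hand-gesture networks reported in the paper would follow the same classification pattern, with audio features and hand images as inputs respectively; their exact designs are not described in the abstract.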
Citation
Jiménez-Moreno, R., Pinzón-Arenas, J. O., & Pachón-Suescún, C. G. (2020). Assistant robot through deep learning. International Journal of Electrical and Computer Engineering, 10(1), 1053–1062. https://doi.org/10.11591/ijece.v10i1.pp1053-1062