Using Inferred Gestures from sEMG Signal to Teleoperate a Domestic Robot for the Disabled

Abstract

With the rapid pace of technological evolution, several methods have been proposed for controlling robots and putting them at the service of humanity. In this work, we present and evaluate a novel learning-based system to control Pepper, a humanoid robot. We leverage an existing low-cost, consumer-grade surface electromyography (sEMG) sensor, the Myo armband. To this end, we created a dataset of 6 hand gestures recorded from 35 intact subjects using the Myo armband, which provides 8 non-intrusive sEMG sensors. Using the raw signals extracted from the Myo armband, we trained a gated recurrent unit (GRU)-based network to perform gesture classification. We then integrated our system into a live hand gesture recognition application that transmits the recognized commands to the robot, implementing a live teleoperation method and allowing us to evaluate the capabilities of our system in real time. According to the experiments, teleoperation of the Pepper robot achieved an average accuracy of 77.5% during testing.
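The abstract describes the pipeline (windowed raw sEMG from the 8-channel Myo armband fed to a GRU network that outputs one of 6 gesture classes) but not the architecture details. The sketch below is a minimal, illustrative PyTorch version of such a classifier; the window length, hidden size, layer count, and choice of framework are assumptions, not the authors' implementation.

```python
# Minimal sketch (PyTorch) of a GRU-based gesture classifier for raw sEMG.
# Assumptions not taken from the paper: window length, hidden size, number of
# layers, and other hyperparameters are illustrative placeholders.
import torch
import torch.nn as nn

class GestureGRU(nn.Module):
    def __init__(self, n_channels=8, n_classes=6, hidden_size=128, num_layers=2):
        super().__init__()
        # n_channels = one value per Myo sEMG sensor at each time step
        self.gru = nn.GRU(input_size=n_channels, hidden_size=hidden_size,
                          num_layers=num_layers, batch_first=True)
        self.classifier = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time_steps, 8) window of raw sEMG samples
        _, h_n = self.gru(x)                # h_n: (num_layers, batch, hidden_size)
        logits = self.classifier(h_n[-1])   # use the final hidden state of the last layer
        return logits                       # (batch, 6) gesture scores

if __name__ == "__main__":
    model = GestureGRU()
    window = torch.randn(4, 200, 8)         # 4 windows of 200 samples x 8 channels
    print(model(window).shape)               # torch.Size([4, 6])
```

In a live teleoperation loop, each incoming sEMG window would be classified this way and the predicted gesture mapped to a robot command before being sent to Pepper.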

Citation (APA)

Nasri, N., Gomez-Donoso, F., Orts-Escolano, S., & Cazorla, M. (2019). Using Inferred Gestures from sEMG Signal to Teleoperate a Domestic Robot for the Disabled. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11507 LNCS, pp. 198–207). Springer Verlag. https://doi.org/10.1007/978-3-030-20518-8_17
