Real-time hand gesture recognition based on electromyographic signals and artificial neural networks

25 Citations · 25 Readers

Abstract

In this paper, we propose a hand gesture recognition model based on surface electromyographic (sEMG) signals. The model responds in approximately 29.38 ms (real time) with a recognition accuracy of 90.7%. We apply a sliding window approach using a main window and a sub-window, where the sub-window observes a segment of the signal captured by the main window. The model is composed of five blocks: data acquisition, preprocessing, feature extraction, classification, and postprocessing. For data acquisition, we use the Myo Armband to measure the electromyographic signals. For preprocessing, we rectify and filter the signals and detect muscle activity. For feature extraction, we build a feature vector from the values of the preprocessed signals and the results of a bag of functions. For classification, we use a feedforward neural network to label every sub-window observation. Finally, for postprocessing, we apply simple majority voting to label the main window observation.
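To make the pipeline concrete, the following is a minimal Python/NumPy sketch of the window-based flow described in the abstract: slide a sub-window over the main window, preprocess, extract features, classify each sub-window with a feedforward network, and resolve the main-window label by majority voting. All window sizes, feature choices, network dimensions, and gesture labels here are illustrative assumptions rather than the paper's values; the network is untrained and the muscle-activity detection step is omitted.

```python
import numpy as np

# Assumed parameters for illustration only (not taken from the paper).
NUM_CHANNELS = 8    # the Myo Armband provides 8 EMG channels
MAIN_WINDOW = 200   # samples per main window (assumed)
SUB_WINDOW = 50     # samples per sub-window (assumed)
STRIDE = 25         # sub-window step (assumed)
GESTURES = ["fist", "open", "wave_in", "wave_out", "pinch"]  # example label set

def preprocess(emg):
    """Rectify and low-pass filter each channel (moving-average filter as a stand-in)."""
    rectified = np.abs(emg)
    kernel = np.ones(5) / 5.0
    return np.apply_along_axis(lambda ch: np.convolve(ch, kernel, mode="same"), 0, rectified)

def extract_features(sub):
    """Build a feature vector from a 'bag of functions' (illustrative feature choices)."""
    feats = [
        sub.mean(axis=0),                  # mean absolute value (signal is rectified)
        sub.std(axis=0),                   # standard deviation
        np.sqrt((sub ** 2).mean(axis=0)),  # root mean square
    ]
    return np.concatenate(feats)

class TinyFeedforwardNet:
    """Placeholder feedforward classifier with random weights (no training shown)."""
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(scale=0.1, size=(n_in, n_hidden))
        self.w2 = rng.normal(scale=0.1, size=(n_hidden, n_out))

    def predict(self, x):
        h = np.tanh(x @ self.w1)
        return int(np.argmax(h @ self.w2))

def classify_main_window(emg, net):
    """Classify each sub-window observation, then label the main window by majority vote."""
    votes = []
    for start in range(0, MAIN_WINDOW - SUB_WINDOW + 1, STRIDE):
        sub = emg[start:start + SUB_WINDOW]
        votes.append(net.predict(extract_features(sub)))
    return GESTURES[np.bincount(votes).argmax()]

if __name__ == "__main__":
    emg = np.random.randn(MAIN_WINDOW, NUM_CHANNELS)  # stand-in for Myo Armband data
    net = TinyFeedforwardNet(n_in=3 * NUM_CHANNELS, n_hidden=32, n_out=len(GESTURES))
    print("Predicted gesture:", classify_main_window(preprocess(emg), net))
```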

Cite

APA

Motoche, C., & Benalcázar, M. E. (2018). Real-time hand gesture recognition based on electromyographic signals and artificial neural networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11139 LNCS, pp. 352–361). Springer Verlag. https://doi.org/10.1007/978-3-030-01418-6_35
