Several data-acquisition techniques exist for gesture recognition, with applications ranging from prosthetic and autonomous control to human-computer interfacing. Most typical techniques depend on image processing and can face portability hurdles. This paper presents a method to classify gestures from surface EMG (sEMG) readings, making the system portable for the user. The sEMG readings, acquired from the upper forearm, point toward gesture recognition for Indian Sign Language (ISL) interpretation. An Artificial Neural Network (ANN) trained with Scaled Conjugate Gradient (SCG) learning processes the data and classifies gestures with an accuracy of 97.5%. Training used 120 samples corresponding to four distinct wrist gestures. Additionally, this paper lays the foundations for user-independent adaptability.
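The pipeline the abstract describes (sEMG feature vectors in, a small feed-forward ANN out, four gesture classes, 120 samples) can be sketched as below. This is an illustrative sketch only: the paper uses Scaled Conjugate Gradient training, which scikit-learn does not implement, so the `lbfgs` solver stands in here; the feature count, network size, and all data are assumptions, with synthetic clusters in place of real sEMG recordings.

```python
# Sketch: classifying four wrist gestures from sEMG feature vectors with a
# small feed-forward ANN. The paper trains with Scaled Conjugate Gradient
# (SCG); scikit-learn has no SCG solver, so 'lbfgs' is used as a stand-in.
# All data below is synthetic, not real sEMG.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_samples, n_features, n_gestures = 120, 8, 4  # 120 samples, 4 gestures (per paper)

# Synthetic per-channel sEMG features (e.g. mean absolute value, RMS),
# drawn as one Gaussian cluster per gesture class.
centers = rng.normal(size=(n_gestures, n_features))
y = np.repeat(np.arange(n_gestures), n_samples // n_gestures)
X = centers[y] + 0.1 * rng.normal(size=(n_samples, n_features))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Standardize features, then fit a one-hidden-layer ANN classifier.
scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(10,), solver="lbfgs",
                    max_iter=500, random_state=0)
clf.fit(scaler.transform(X_train), y_train)
accuracy = clf.score(scaler.transform(X_test), y_test)
```

On cleanly separated synthetic clusters like these, test accuracy is high; the paper's reported 97.5% refers to its own real sEMG data, not this sketch.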
Kaginalkar, A., & Agrawal, A. (2015). Towards EMG based gesture recognition for Indian sign language interpretation using artificial neural networks. In Communications in Computer and Information Science (Vol. 528, pp. 718–723). Springer Verlag. https://doi.org/10.1007/978-3-319-21380-4_121