Assistive Data Glove for Isolated Static Postures Recognition in American Sign Language Using Neural Network

Abstract

Sign language recognition remains a challenging task. Researchers in this domain have pursued different implementation strategies for sign recognition, all of which require smart prototypes for capturing and classifying sign gestures. With these design considerations in mind, sensor-based, vision-based, and hybrid prototypes have been developed. In this paper, the authors design sensor-based assistive gloves to capture signs for letters and digits. These signs are a small but important fraction of the ASL dictionary, since they play an essential role in fingerspelling, a universal signed linguistic strategy for expressing personal names, technical terms, lexical gaps, and emphasis. A scaled conjugate gradient backpropagation algorithm is used to train a fully connected neural network on a self-collected dataset of isolated static postures of digits, letters, and alphanumeric characters. The authors also analyze the impact of activation functions on network performance. The recognition network produced promising results on this small dataset of static gestures.
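
Illustrative sketch only: the abstract does not specify the glove's sensor count, the network's layer sizes, or the exact class list, so the Python sketch below assumes five flex-sensor readings per posture and 36 output classes (26 letters plus 10 digits), and the names GloveClassifier and train are hypothetical. It uses PyTorch with a stock Adam optimizer as a stand-in, because the scaled conjugate gradient backpropagation used in the paper is not a built-in PyTorch optimizer; the activation function is passed as a constructor argument to mirror the paper's comparison of activation functions.

# Illustrative sketch only -- sensor count, layer sizes, class count, and the
# optimizer are assumptions, not the configuration reported in the paper.
import torch
import torch.nn as nn

NUM_SENSORS = 5    # assumed: one flex-sensor reading per finger
NUM_CLASSES = 36   # assumed: 26 letters + 10 digits

class GloveClassifier(nn.Module):
    """Fully connected network mapping glove sensor readings to posture labels."""
    def __init__(self, activation: nn.Module = nn.Tanh()):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_SENSORS, 32),
            activation,
            nn.Linear(32, 32),
            activation,
            nn.Linear(32, NUM_CLASSES),  # raw logits; softmax is applied inside the loss
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def train(model, features, labels, epochs=200, lr=1e-3):
    """Plain mini-batch training loop. Adam is a stand-in here for the scaled
    conjugate gradient backpropagation described in the paper, which has no
    stock PyTorch implementation."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    loader = torch.utils.data.DataLoader(
        torch.utils.data.TensorDataset(features, labels),
        batch_size=32, shuffle=True,
    )
    for _ in range(epochs):
        for x, y in loader:
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
    return model

if __name__ == "__main__":
    # Synthetic data standing in for real glove recordings.
    X = torch.rand(500, NUM_SENSORS)            # normalized sensor readings
    y = torch.randint(0, NUM_CLASSES, (500,))   # posture labels
    model = train(GloveClassifier(activation=nn.ReLU()), X, y)
    print(model(X[:1]).argmax(dim=1))           # predicted class index for one sample

In a real pipeline, X and y would come from the glove's recorded flex-sensor values and their posture labels, and the held-out accuracy would be compared across activation functions as the paper does.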

Citation (APA)

Amin, M. S., Rizvi, S. T. H., Mazzei, A., & Anselma, L. (2023). Assistive Data Glove for Isolated Static Postures Recognition in American Sign Language Using Neural Network. Electronics (Switzerland), 12(8). https://doi.org/10.3390/electronics12081904
