Sign Language Translation using Hand Gesture Detection

Abstract

This paper demonstrates a sign-to-speech (voice) converter module for automatic conversion of American Sign Language (ASL) into English speech and text. It narrows the communication gap between speech-impaired people and others. It can be used to understand the thoughts or views that a speech-impaired person communicates through ASL gestures but fails to convey because of the large communication gap between them. It also works as a translator for people who do not understand sign language, enabling communication in the natural way of speaking. The proposed module is an interactive application developed using Python and its advanced libraries. It uses the system's built-in camera to capture images, analyzes those images to predict the meaning of each gesture, and outputs the result as text on screen and as speech through the system's speaker, which makes the module highly cost-effective. The module recognizes one-handed ASL gestures for the alphabet (A–Z) with consistently high precision and accuracy.
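The pipeline the abstract describes (capture a frame, classify the gesture, emit the letter as on-screen text and speech) can be sketched as below. This is a minimal illustrative stand-in, not the paper's implementation: the paper does not specify its classifier or libraries, so `classify` and `speak` are hypothetical callables representing the recognition model and a text-to-speech engine.

```python
from typing import Callable

def index_to_letter(index: int) -> str:
    """Map a classifier's output index (0-25) to its ASL alphabet letter."""
    if not 0 <= index <= 25:
        raise ValueError("expected a class index in 0..25 for A-Z")
    return chr(ord("A") + index)

def translate_frame(frame, classify: Callable[[object], int],
                    speak: Callable[[str], None] = print) -> str:
    """Classify one camera frame and emit the letter as text and speech.

    `frame` would be an image from the system camera (e.g. captured with
    OpenCV); `classify` stands in for the gesture-recognition model, and
    `speak` for a text-to-speech back end driving the system speaker.
    """
    letter = index_to_letter(classify(frame))
    speak(letter)   # real module: speech output through the speaker
    return letter   # real module: shown as text on screen

# Usage with a dummy classifier that always predicts class 0 ("A"):
result = translate_frame(frame=None, classify=lambda f: 0,
                         speak=lambda s: None)
```

In a full application the loop would repeatedly grab frames from the built-in camera and feed each one through `translate_frame`; only the mapping and plumbing are shown here.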

Citation (APA)
Shrivastava, Prof. (Dr. ) N. … Sharma, A. (2020). Sign Language Translation using Hand Gesture Detection. International Journal of Recent Technology and Engineering (IJRTE), 9(2), 509–512. https://doi.org/10.35940/ijrte.b3620.079220
