Hand Gesture Recognition for Deaf and Mute

Communication is the fundamental channel through which individuals interact. Gestures and signs play a major role in communication, and a great deal of research has been devoted to them over the past decades. To improve recognition rates, researchers have applied techniques such as Hidden Markov Models (HMMs), artificial neural networks, and the Kinect platform, and effective algorithms for segmentation, classification, pattern matching, and recognition have evolved. Sign language is widely used by people with hearing disabilities to communicate conveniently with one another through hand gestures. The proposed system uses image processing and neural networks to capture and convert gestures, relying on a number of Python packages to process the input and generate results. The application captures gestures through a laptop webcam and recognizes the gestures shown by the user, using TensorFlow and Keras to train a model on the dataset. Each gesture shown by the user is compared with the stored gestures, and the corresponding output is generated along with speech output. The application thus removes the communication barrier between hearing-impaired or mute people and others.
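The matching step described in the abstract, comparing a captured gesture against stored gestures, can be sketched as a nearest-neighbour lookup over feature vectors. This is a minimal illustration with hypothetical labels and hand-written vectors; in the actual system the features would come from the trained TensorFlow/Keras model rather than raw distance matching:

```python
import numpy as np

# Hypothetical stored gesture "templates": label -> feature vector.
# In the described application these would be embeddings produced by the
# trained Keras model for each gesture in the dataset, not fixed vectors.
stored_gestures = {
    "hello": np.array([0.9, 0.1, 0.0]),
    "thanks": np.array([0.1, 0.8, 0.2]),
    "yes": np.array([0.0, 0.2, 0.9]),
}

def recognize(features: np.ndarray) -> str:
    """Return the label of the stored gesture closest to `features`
    (Euclidean distance)."""
    return min(
        stored_gestures,
        key=lambda label: np.linalg.norm(stored_gestures[label] - features),
    )

# Features extracted from a webcam frame, close to the "hello" template:
print(recognize(np.array([0.85, 0.15, 0.05])))  # → hello
```

The recognized label could then be rendered as text and passed to a text-to-speech package to produce the speech output the abstract mentions.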




Kumar, P., Rawat, S., … Bhagat, S. (2019). Hand Gesture Recognition for Deaf and Mute. International Journal of Innovative Technology and Exploring Engineering, 9(2), 1238–1242. https://doi.org/10.35940/ijitee.b6294.129219
