Individuals with hearing and speech impairments communicate using sign language: hand movements, body posture, and facial expressions are the means by which people who cannot hear or speak communicate. Bangla sign alphabets are formed with one- or two-hand movements, and a set of features differentiates the signs. Detecting and recognizing a sign therefore requires analyzing its shape and comparing its features. This paper proposes a model and builds a computer system that recognizes Bangla Sign Language alphabets and translates them into the corresponding Bangla letters by means of a deep convolutional neural network (CNN). The CNN is introduced in this model as a pre-trained network called "MobileNet", which achieved an average accuracy of 95.71% in recognizing 36 Bangla Sign Language alphabets.
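The transfer-learning setup the abstract describes (a pre-trained MobileNet adapted to 36 Bangla Sign Language classes) can be sketched as follows. This is a minimal illustration using the Keras API, not the authors' actual configuration; the input size, classifier head, dropout rate, and optimizer are assumptions, since the abstract does not specify them.

```python
# Hedged sketch: adapting MobileNet to a 36-class Bangla Sign Language
# alphabet classifier. Hyperparameters below are illustrative assumptions,
# not the paper's reported setup.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 36  # Bangla Sign Language alphabets (from the abstract)

# weights=None keeps this sketch runnable offline; set weights="imagenet"
# to load the pre-trained ImageNet features, as the paper's approach implies.
base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3),
    include_top=False,   # drop the original 1000-class ImageNet head
    weights=None,
    pooling="avg",       # global average pooling over the feature maps
)
base.trainable = False   # freeze the convolutional feature extractor

# Replace the head with a small classifier for the 36 sign classes.
model = models.Sequential([
    base,
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

The model would then be trained with `model.fit(...)` on labeled sign-alphabet images; freezing the base and training only the new head is the usual first stage of fine-tuning a pre-trained network on a small dataset.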
CITATION STYLE
Angona, T. M., Shaon, A. S. M. S., Rashad Niloy, K. T., Karim, T., Tasnim, Z., Reza, S. M. S., & Mahbub, T. N. (2020). Automated Bangla sign language translation system for alphabets by means of MobileNet. TELKOMNIKA (Telecommunication Computing Electronics and Control), 18(3), 1292–1301. https://doi.org/10.12928/TELKOMNIKA.V18I3.15311