Abstract
The number of deaf and mute individuals worldwide is rising at an alarming rate. Bangladesh has about 2.6 million people who are unable to communicate with the community through spoken language. Hearing-impaired citizens of Bangladesh use Bangladeshi sign language (BSL) as their means of communication. In this article, we propose a new method for Bengali sign language recognition based on deep convolutional neural networks. Our framework employs a convolutional neural network (CNN) to learn from the images in our dataset and interpret hand signs in input images. The dataset consists of ten sets of images covering 31 distinct signs, for a total of 310 images. The proposed system captures snapshots from a webcam video stream using a computer-vision-based approach, compares those frames against the model previously trained on the dataset with the CNN, and displays the corresponding Bengali numeral (০–৯). Evaluating the model on our dataset, we obtained an overall accuracy of 99.8%. We aim to strengthen the system as far as we can, to make communication between silent individuals and the majority of society as simple as possible.
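The abstract describes classifying hand-sign images with a CNN. As a rough illustration of the forward pass such a classifier performs (convolution, ReLU, pooling, then a dense softmax layer over the ten digit classes), here is a minimal NumPy sketch; the image size, filter shape, and random weights are illustrative assumptions, not the architecture used in the paper:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a single-channel image with one filter."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims the map to a multiple of the pool size."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Toy 8x8 "hand-sign" image and a random 3x3 filter (stand-ins for real data/weights)
rng = np.random.default_rng(0)
img = rng.random((8, 8))
kernel = rng.standard_normal((3, 3))

features = max_pool(relu(conv2d(img, kernel)))  # 6x6 map pooled to 3x3
flat = features.ravel()

# One dense layer mapping the pooled features to the ten digit classes (0-9)
weights = rng.standard_normal((10, flat.size))
probs = softmax(weights @ flat)
pred = int(np.argmax(probs))  # index of the predicted Bengali numeral
```

In the actual system these weights would be learned from the 310 labelled training images rather than drawn at random, and the input would be a preprocessed webcam frame.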
Citation
Shamrat, F. M. J. M., Chakraborty, S., Billah, M. M., Kabir, M., Shadin, N. S., & Sanjana, S. (2021). Bangla numerical sign language recognition using convolutional neural networks. Indonesian Journal of Electrical Engineering and Computer Science, 23(1), 405–413. https://doi.org/10.11591/ijeecs.v23.i1.pp405-413