Translator of Indonesian Sign Language Video using Convolutional Neural Network with Transfer Learning

  • Shania S
  • Farid Naufal M
  • Riandaru Prasetyo V
  • et al.
Citations: N/A
Readers: 61 (Mendeley users who have this article in their library)

Abstract

Sign language is a language used to communicate through gestures and facial expressions. This study focuses on the classification of Bahasa Isyarat Indonesia (BISINDO). Many people still have difficulty communicating with deaf people. This study builds a video-based translator system using a Convolutional Neural Network (CNN) with transfer learning, an approach commonly used in computer vision, especially for image classification. The transfer learning architectures used in this study are MobileNetV2, ResNet50V2, and Xception. The study uses 11 commonly used vocabulary items in BISINDO. Predictions are made in a real-time scenario using a webcam. In addition, the system gave good results in an experiment using an interaction approach between a pair of deaf and hearing participants. Across all experiments, the Xception architecture achieved the best F1 score, at 98.5%.
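The abstract does not give implementation details, but the transfer-learning setup it describes (a pretrained backbone such as Xception with a new classification head for the 11 BISINDO classes) can be sketched in Keras roughly as follows. This is an illustrative sketch, not the authors' code: the input size, head layers, and hyperparameters are assumptions, and `weights=None` is used here only to avoid downloading pretrained weights (in actual transfer learning one would pass `weights="imagenet"`).

```python
from tensorflow.keras import layers, models
from tensorflow.keras.applications import Xception

NUM_CLASSES = 11  # the 11 BISINDO vocabulary items mentioned in the abstract

# Backbone without its ImageNet classification head. For real transfer
# learning, use weights="imagenet"; weights=None keeps this sketch offline.
base = Xception(weights=None, include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the backbone; train only the new head

# New classification head for the BISINDO classes (layer sizes are assumed).
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Frames captured from the webcam would then be resized to the model's input shape and passed to `model.predict` for real-time classification; the same head can be attached to MobileNetV2 or ResNet50V2 backbones for comparison.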

Citation (APA)

Shania, S., Farid Naufal, M., Riandaru Prasetyo, V., & Bin Azmi, M. S. (2022). Translator of Indonesian Sign Language Video using Convolutional Neural Network with Transfer Learning. Indonesian Journal of Information Systems, 5(1), 17–27. https://doi.org/10.24002/ijis.v5i1.5865
