Recognizing and Interpreting Sign Language Gesture for Human Robot Interaction

  • Singh S
  • Jain A
  • Kumar D

Abstract

Visual interpretation of sign language gestures can help achieve natural human-robot interaction. This paper describes a real-time system for recognizing, interpreting, and imitating Indian Sign Language (ISL) gestures for human-robot interaction, permitting convenient gesture-based communication with a humanoid robot. Classification, recognition, learning, and interpretation are carried out on features extracted from the ISL gestures: the chain code and the Fisher score serve as the feature vector. Recognition is performed with two statistical approaches, the Hidden Markov Model (HMM) and a feed-forward back-propagation neural network (FNN), in order to achieve satisfactory recognition accuracy. The sensitivity, specificity, and accuracy were found to be 98.60%, 97.64%, and 97.52%, respectively. It can be concluded that the FNN gives fast and accurate recognition and works as a promising tool for recognition and interpretation of sign language gestures for human-computer interaction. The overall recognition and interpretation accuracy of the proposed system is 95%.
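The two hand-crafted features named in the abstract can be sketched as follows. This is an illustrative reconstruction under assumptions, not the authors' implementation: it assumes a gesture contour given as 8-connected grid points for the Freeman chain code, and per-feature values with class labels for the Fisher score; the function names are hypothetical.

```python
# Illustrative sketch (not the paper's code) of the two features the
# abstract mentions: a Freeman 8-direction chain code and a Fisher score.

# 8-neighbour offsets (dx, dy) mapped to Freeman directions 0..7
DIRECTIONS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
              (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(contour):
    """Encode a closed contour (list of (x, y) points on an 8-connected
    grid, consecutive points one step apart) as a Freeman chain code."""
    codes = []
    for (x0, y0), (x1, y1) in zip(contour, contour[1:] + contour[:1]):
        codes.append(DIRECTIONS[(x1 - x0, y1 - y0)])
    return codes

def fisher_score(values, labels):
    """Fisher score of one feature: between-class scatter of the class
    means divided by the weighted within-class variance. Higher means
    the feature separates the classes better."""
    overall = sum(values) / len(values)
    num = den = 0.0
    for c in set(labels):
        vc = [v for v, l in zip(values, labels) if l == c]
        mu = sum(vc) / len(vc)
        var = sum((v - mu) ** 2 for v in vc) / len(vc)
        num += len(vc) * (mu - overall) ** 2
        den += len(vc) * var
    return num / den if den else float("inf")

# Example: unit square traversed counter-clockwise
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(chain_code(square))  # -> [0, 2, 4, 6]

# A feature whose values cluster tightly per class scores high (~16 here)
print(fisher_score([0.0, 0.2, 1.0, 0.8], [0, 0, 1, 1]))
```

In a full pipeline such features would be computed per frame of the gesture sequence and fed to the HMM or FNN classifier described in the abstract.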

APA

Singh, S., Jain, A., & Kumar, D. (2012). Recognizing and Interpreting Sign Language Gesture for Human Robot Interaction. International Journal of Computer Applications, 52(11), 24–31. https://doi.org/10.5120/8247-1758
