Machine translation of sign language is a complex and challenging problem in computer vision research. In this work, we address hand tracking, feature representation, and classification for efficient interpretation of sign language from isolated sign videos. Hand tracking is performed sequentially, one hand after the other, with the effects of head movement nullified using a serial particle filter. The estimated hand positions in the video sequence are used to extract the hand regions, from which a feature covariance matrix is computed. This matrix is a compact representation of the hand features describing a sign. The adaptability of the feature covariance matrix is explored by relating it to new signs without constructing a separate feature matrix for each sign. The extracted features are then fed to a neural network classifier trained with the error backpropagation algorithm. Multiple experiments were conducted on 181 sign classes, with 50 sentence formations performed by 5 different signers. Experimental results show that the proposed sequential hand tracking stays close to the ground truth. The proposed covariance features yielded a classification accuracy of 89.34% with the neural network classifier.
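To make the feature representation concrete, the sketch below illustrates the general idea of a region covariance descriptor computed over a tracked hand patch: each pixel is mapped to a small feature vector and the patch is summarized by the covariance of those vectors. The specific per-pixel features (coordinates, intensity, gradient magnitudes), the patch size, and the function name are assumptions for illustration; the paper's exact feature set may differ.

```python
import numpy as np


def region_covariance(patch):
    """Covariance descriptor for a grayscale hand patch.

    Minimal sketch: per-pixel features are (x, y, intensity, |Ix|, |Iy|);
    the exact feature set used in the paper is not specified here and
    is an assumption. The descriptor is a fixed-size matrix regardless
    of the patch dimensions, which is what makes it compact.
    """
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)

    # First-order gradients approximated with finite differences.
    gx = np.gradient(patch.astype(float), axis=1)
    gy = np.gradient(patch.astype(float), axis=0)

    # Stack per-pixel features: each row is one pixel's feature vector.
    feats = np.stack(
        [xs.ravel(), ys.ravel(), patch.ravel().astype(float),
         np.abs(gx).ravel(), np.abs(gy).ravel()],
        axis=1,
    )

    # 5x5 covariance matrix summarizing the hand region.
    return np.cov(feats, rowvar=False)


if __name__ == "__main__":
    # Example with a random 64x64 stand-in for an extracted hand patch.
    patch = (np.random.rand(64, 64) * 255).astype(np.uint8)
    C = region_covariance(patch)
    print(C.shape)  # (5, 5)
```

In this setup, the upper-triangular entries of the covariance matrix can be flattened into a fixed-length vector and passed to a classifier, which is one common way such descriptors are fed to a neural network.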
CITATION STYLE
Praveen Kumar, P., Prasad Reddy, P. V. G. D., & Srinivasa Rao, P. (2018). Sequential particle filter with covariance features classified with artificial neural nets for continuous Indian sign language recognition. International Journal of Engineering and Technology (UAE), 7(1.1 Special Issue 1), 539–547. https://doi.org/10.14419/ijet.v7i1.1.10163