A critical component of automated visual sign language recognition is tracking the signer’s hands, especially in the presence of frequent and persistent occlusions and complex hand interactions. In this paper, we propose to incorporate prior knowledge about hand motion, drawn from sign language linguistic models, into a multiple hypothesis tracking framework. Hand motion constraints identified by sign language phonological models, such as the hand symmetry condition, are used as part of the data association process. Initial experimental results demonstrate the validity of the proposed approach.
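The core idea above can be illustrated with a small sketch: when associating new hand detections to the left- and right-hand tracks, a joint hypothesis is scored not only by motion proximity but also by a linguistic symmetry prior that rewards mirrored hand motion. All names, the exponential score forms, and the scale constants below are illustrative assumptions, not the paper's actual formulation; a full MHT would also maintain multiple hypotheses across frames rather than keeping only the best one.

```python
import itertools
import math

def motion_vector(track, detection):
    # Displacement implied by assigning `detection` to the track's last position.
    return (detection[0] - track[-1][0], detection[1] - track[-1][1])

def symmetry_score(v_left, v_right):
    # Hand symmetry prior (hypothetical form): in many two-handed signs the
    # hands move with mirrored horizontal motion. Reward left-hand motion that
    # mirrors the right hand's x-component; scale 20.0 is an assumption.
    mirrored = (-v_right[0], v_right[1])
    d = math.hypot(v_left[0] - mirrored[0], v_left[1] - mirrored[1])
    return math.exp(-d / 20.0)  # value in (0, 1]

def hypothesis_score(left_track, right_track, assignment):
    # Score one joint assignment (one detection per hand): a nearness prior
    # weighted by the linguistic symmetry prior.
    det_left, det_right = assignment
    v_left = motion_vector(left_track, det_left)
    v_right = motion_vector(right_track, det_right)
    nearness = math.exp(-(math.hypot(*v_left) + math.hypot(*v_right)) / 50.0)
    return nearness * symmetry_score(v_left, v_right)

def best_assignment(left_track, right_track, detections):
    # Enumerate joint data-association hypotheses (a one-frame stand-in for the
    # MHT hypothesis tree) and keep the highest-scoring one.
    hypotheses = itertools.permutations(detections, 2)
    return max(hypotheses,
               key=lambda h: hypothesis_score(left_track, right_track, h))
```

For example, with the hands last seen at (100, 200) and (300, 200) and three candidate detections, the symmetry-weighted score prefers the pair that moves the hands in mirrored directions over a nearer but asymmetric distractor.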
Citation: Borg, M., & Camilleri, K. P. (2015). Multiple hypothesis tracking with sign language hand motion constraints. In Lecture Notes in Computer Science (Vol. 9257, pp. 207–219). Springer. https://doi.org/10.1007/978-3-319-23117-4_18