Depth data-driven real-time articulated hand pose recognition


Abstract

This paper presents a fast yet robust method to recognize articulated hand pose from single depth images in real time. We tackle the main challenges in hand pose recognition, namely the high degrees of freedom and the self-occlusion of articulated hand motion, through efficient retrieval from a large set of hand pose templates. Normalized orientation templates are used to encode the depth images containing hand poses, and locality-sensitive hashing is used to find the nearest neighbors in real time. Our approach does not suffer from problems common to conventional tracking approaches, such as model initialization and tracking drift, and qualitatively outperforms existing hand pose estimation techniques.
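To illustrate the retrieval idea described above, the sketch below shows generic random-hyperplane locality-sensitive hashing over a bank of flattened template descriptors. It is not the authors' implementation: the descriptor length, number of hash bits, number of tables, and the synthetic template data are all placeholder assumptions, and the authors' actual normalized orientation template encoding is replaced here by arbitrary feature vectors.

```python
import numpy as np

# Illustrative sketch only (not the paper's code): random-hyperplane LSH over
# fixed-length descriptors standing in for flattened hand-pose templates.

rng = np.random.default_rng(0)

D = 512            # assumed descriptor length
N_TEMPLATES = 10_000
N_BITS = 32        # hash bits per table
N_TABLES = 8       # multiple tables raise the chance of finding a true neighbor

# Placeholder template bank; in practice these would be encoded depth templates.
templates = rng.standard_normal((N_TEMPLATES, D)).astype(np.float32)

# One random projection matrix per hash table.
projections = [rng.standard_normal((D, N_BITS)).astype(np.float32)
               for _ in range(N_TABLES)]

def hash_key(vec, proj):
    """The sign of each random projection gives one bit of the hash key."""
    bits = (vec @ proj) > 0
    return bits.tobytes()

# Build the hash tables offline.
tables = [dict() for _ in range(N_TABLES)]
for idx, t in enumerate(templates):
    for table, proj in zip(tables, projections):
        table.setdefault(hash_key(t, proj), []).append(idx)

def query(descriptor, k=1):
    """Collect candidates from all tables, then rank them by exact distance."""
    candidates = set()
    for table, proj in zip(tables, projections):
        candidates.update(table.get(hash_key(descriptor, proj), []))
    if not candidates:
        return []
    cand = np.fromiter(candidates, dtype=np.int64)
    dists = np.linalg.norm(templates[cand] - descriptor, axis=1)
    return cand[np.argsort(dists)[:k]].tolist()

# Example: query with a noisy copy of a stored template.
probe = templates[42] + 0.05 * rng.standard_normal(D).astype(np.float32)
print(query(probe, k=3))   # expected to include index 42 among the top hits
```

Because the hash key is computed from a handful of dot products and candidate sets are small, lookup cost stays far below a brute-force scan of the full template set, which is what makes this style of retrieval suitable for real-time use.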

Citation (APA)

Cha, Y. W., Lim, H., Sung, M. H., & Ahn, S. C. (2014). Depth data-driven real-time articulated hand pose recognition. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8888, pp. 492–501). Springer Verlag. https://doi.org/10.1007/978-3-319-14364-4_47
