Hand contour tracking using condensation and partitioned sampling

Abstract

In this paper, we present a visual articulated hand contour tracker that is capable of tracking, in real time, the contour of an unadorned articulated hand with the palm approximately parallel to the camera's image plane. In our implementation, a B-spline deformable template is used to represent the human hand contour, and a 14-dimensional non-linear state space, divided into 7 parts, is used to represent the dynamics of the hand contour. Tracking is performed on a grey-scale skin-color image using a particle filter with partitioned sampling. First, a Gaussian model is used to extract the skin pixels. Second, particles for each of the 7 parts of the non-linear state space are generated hierarchically using second-order auto-regressive processes and partitioned sampling, and each generated particle is then weighted by an observation density. Finally, the best complete particle is chosen as the tracking result, and several complete particles are stored for use in the next frame. Experiments show that our tracker performs well when tracking both rigid movements of the whole hand and non-rigid movements of individual fingers. © 2008 Springer-Verlag Berlin Heidelberg.
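The per-frame loop described above can be summarized in a short sketch. The Python snippet below is a minimal illustration of a partitioned-sampling particle filter step, assuming (purely for illustration) that the 14-dimensional state splits into 7 partitions of 2 dimensions each; the AR(2) coefficients, noise scale, observation density, and particle count are placeholder values, not the ones used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

STATE_DIM = 14      # 14-dimensional hand-contour state
N_PARTITIONS = 7    # divided into 7 parts (assumed here: 2 dims per part)
PART_SLICES = [slice(2 * i, 2 * i + 2) for i in range(N_PARTITIONS)]
N_PARTICLES = 200


def observation_weight(states, part, skin_image):
    """Placeholder observation density. In the paper each particle's predicted
    contour is scored against the grey-scale skin-color image; here we return
    dummy likelihoods so that the control flow is runnable."""
    return np.exp(-np.sum(states[:, part] ** 2, axis=1))


def partitioned_sampling_step(particles, skin_image):
    """One frame of tracking.

    `particles` has shape (N, 2*STATE_DIM), holding [x_{t-1}, x_{t-2}] per
    particle. The 7 partitions are processed hierarchically: each partition is
    predicted with a second-order auto-regressive model, weighted by the
    observation density, and resampled before moving on to the next partition,
    so the search stays in low-dimensional subspaces.
    """
    new = particles[:, :STATE_DIM].copy()   # will become x_t
    hist = particles                        # keeps [x_{t-1}, x_{t-2}] paired
    for part in PART_SLICES:
        x1 = hist[:, :STATE_DIM][:, part]
        x2 = hist[:, STATE_DIM:][:, part]
        # AR(2) prediction with placeholder coefficient and noise scale
        new[:, part] = x1 + 0.5 * (x1 - x2) + 0.05 * rng.standard_normal(x1.shape)
        w = observation_weight(new, part, skin_image)
        idx = rng.choice(len(new), size=len(new), p=w / w.sum())
        new, hist = new[idx], hist[idx]     # resampling keeps histories aligned
    # weight the complete particles; the best one is the tracking output
    w = observation_weight(new, slice(0, STATE_DIM), skin_image)
    best = new[np.argmax(w)]
    # carry [x_t, x_{t-1}] forward as the next frame's history
    return np.hstack([new, hist[:, :STATE_DIM]]), best


if __name__ == "__main__":
    particles = 0.1 * rng.standard_normal((N_PARTICLES, 2 * STATE_DIM))
    frame = None  # stand-in for the grey-scale skin-color image
    particles, best_state = partitioned_sampling_step(particles, frame)
    print("best complete particle:", np.round(best_state, 3))
```

In a real implementation the observation density would compare measurement lines along the B-spline contour against the skin-probability image, and the stored complete particles from one frame would seed the next; the sketch only reproduces the hierarchical predict-weight-resample structure.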

Citation (APA)

Zhou, D., Wang, Y., & Chen, X. (2008). Hand contour tracking using condensation and partitioned sampling. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5093 LNCS, pp. 343–352). https://doi.org/10.1007/978-3-540-69736-7_37
