Robust arm and hand tracking by unsupervised context learning


Abstract

Hand tracking in video is an increasingly popular research field due to the rise of novel human-computer interaction methods. However, robust, real-time hand tracking in unconstrained environments remains a challenging task due to the high number of degrees of freedom and the non-rigid character of the human hand. In this paper, we propose an unsupervised method to automatically learn the context in which a hand is embedded. This context includes the arm and any other object that moves coherently with the hand. We introduce two novel methods to incorporate this context information into a probabilistic tracking framework, and a simple yet effective solution to estimate the position of the arm. Finally, we show that our method greatly increases robustness against occlusion and cluttered backgrounds, without degrading tracking performance when no contextual information is available. The proposed real-time algorithm is shown to outperform the current state of the art on three publicly available video datasets. Furthermore, a novel dataset is created and made publicly available for the research community. © 2014 by the authors; licensee MDPI, Basel, Switzerland.
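The abstract gives no implementation details, but the core idea of fusing a learned context cue into a probabilistic tracker can be illustrated with a minimal sketch. The sketch below assumes a basic particle filter in which a hypothetical appearance term (`hand_likelihood`) is combined with a context term (`context_likelihood`) that rewards particles lying at a learned spatial offset from a coherently moving context object such as the arm. All function names, the weighting parameter `alpha`, and the noise scales are illustrative assumptions, not the authors' actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def hand_likelihood(particles, hand_pos):
    # Hypothetical appearance cue: Gaussian score around the observed hand.
    d2 = np.sum((particles - hand_pos) ** 2, axis=1)
    return np.exp(-d2 / (2 * 20.0 ** 2))

def context_likelihood(particles, context_pos, offset):
    # Context cue: particles are rewarded when they sit at the learned
    # spatial offset from a coherently moving context object (e.g. the arm).
    predicted = context_pos + offset
    d2 = np.sum((particles - predicted) ** 2, axis=1)
    return np.exp(-d2 / (2 * 40.0 ** 2))

def step(particles, hand_pos, context_pos, offset, alpha=0.5):
    # Propagate particles with a simple random-walk motion model.
    particles = particles + rng.normal(0.0, 5.0, particles.shape)
    # Fuse the appearance and context cues; alpha down-weights the context
    # term so the tracker degrades gracefully when context is unreliable.
    w = (hand_likelihood(particles, hand_pos)
         * context_likelihood(particles, context_pos, offset) ** alpha)
    w /= w.sum()
    # Resample proportionally to the fused weights.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Toy run: hand and arm move together; the learned offset links them.
particles = rng.uniform(0.0, 200.0, (500, 2))
offset = np.array([0.0, -30.0])   # assumed: hand sits 30 px above the arm
for t in range(20):
    hand = np.array([100.0 + 2.0 * t, 80.0])
    arm = hand - offset           # context object moving coherently
    particles = step(particles, hand, arm, offset)
print("estimated hand position:", particles.mean(axis=0))
```

In this toy setup, the context term keeps the particle cloud anchored near the arm even when the appearance cue alone would be ambiguous, which mirrors (in a very simplified way) the robustness-to-occlusion argument made in the abstract.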

Citation (APA)

Spruyt, V., Ledda, A., & Philips, W. (2014). Robust arm and hand tracking by unsupervised context learning. Sensors (Switzerland), 14(7), 12023–12058. https://doi.org/10.3390/s140712023
