Segmentation-free, area-based articulated object tracking


Abstract

We propose a novel, model-based approach for articulated object detection and pose estimation that does not need any low-level feature extraction or foreground segmentation and thus eliminates this error-prone step. Our approach works directly on the input color image and is based on a new kind of divergence of the color distribution between an object hypothesis and its background. Consequently, we get a color distribution of the target object for free. We further propose a coarse-to-fine and hierarchical algorithm for fast object localization and pose estimation. Our approach works significantly better than segmentation-based approaches in cases where the segmentation is noisy or fails, e.g., scenes with skin-colored backgrounds or bad illumination that distorts the skin color. We also present results from applying our novel approach to markerless hand tracking. © 2011 Springer-Verlag.
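
As a rough illustration of the area-based idea, the sketch below scores a rectangular object hypothesis by how strongly the color distribution inside it diverges from that of the surrounding background ring, and searches coarse-to-fine over image positions. This is only a sketch under stated assumptions: the Bhattacharyya coefficient, the rectangular hypothesis shape, the window margin, and all function names are illustrative stand-ins; the paper defines its own divergence measure and a hierarchical pose-hypothesis search that are not detailed in this abstract.

# Illustrative sketch only (not the authors' method): histogram-based
# inside-vs-outside color divergence, scanned coarse-to-fine.
import numpy as np

def color_histogram(pixels, bins=8):
    """3-D RGB histogram of an (N, 3) pixel array, normalized to sum to 1."""
    hist, _ = np.histogramdd(pixels, bins=(bins, bins, bins),
                             range=((0, 256),) * 3)
    total = hist.sum()
    return hist / total if total > 0 else hist

def bhattacharyya(p, q):
    """Bhattacharyya coefficient: 1 = identical distributions, 0 = disjoint."""
    return np.sum(np.sqrt(p * q))

def hypothesis_score(image, x, y, w, h, margin=10, bins=8):
    """Score a rectangular hypothesis: the more the color distribution inside
    the rectangle diverges from its surrounding background ring, the better."""
    H, W, _ = image.shape
    x0, y0 = max(x - margin, 0), max(y - margin, 0)
    x1, y1 = min(x + w + margin, W), min(y + h + margin, H)
    inner = np.zeros((H, W), dtype=bool)
    inner[y:y + h, x:x + w] = True          # hypothesis region
    outer = np.zeros((H, W), dtype=bool)
    outer[y0:y1, x0:x1] = True              # enlarged window
    ring = outer & ~inner                   # surrounding background ring
    p = color_histogram(image[inner], bins)
    q = color_histogram(image[ring], bins)
    return 1.0 - bhattacharyya(p, q)        # dissimilar inside/outside = high score

def coarse_to_fine_localize(image, w, h, coarse_step=16, fine_step=2, refine_radius=16):
    """Scan hypotheses on a coarse grid, then refine around the best one."""
    H, W, _ = image.shape

    def best_over(xs, ys):
        best = (-np.inf, 0, 0)
        for yy in ys:
            for xx in xs:
                s = hypothesis_score(image, xx, yy, w, h)
                if s > best[0]:
                    best = (s, xx, yy)
        return best

    # Coarse pass over the whole image.
    s, x, y = best_over(range(0, W - w, coarse_step),
                        range(0, H - h, coarse_step))
    # Fine pass in a neighborhood of the coarse optimum.
    s, x, y = best_over(range(max(x - refine_radius, 0), min(x + refine_radius, W - w), fine_step),
                        range(max(y - refine_radius, 0), min(y + refine_radius, H - h), fine_step))
    return x, y, s

# Example usage on a random test frame (stand-in for a real image):
frame = np.random.randint(0, 256, size=(240, 320, 3), dtype=np.uint8)
x, y, score = coarse_to_fine_localize(frame, w=60, h=80)
print(f"best hypothesis at ({x}, {y}), divergence score {score:.3f}")

In the paper's setting the hypothesis set would enumerate articulated poses of a template rather than axis-aligned rectangles, and the coarse-to-fine search would run over a hierarchy of pose hypotheses; the rectangle scan above is only meant to convey the segmentation-free, area-based scoring idea.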

Citation (APA)

Mohr, D., & Zachmann, G. (2011). Segmentation-free, area-based articulated object tracking. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6938 LNCS, pp. 112–123). https://doi.org/10.1007/978-3-642-24028-7_11
