3D face tracking using appearance registration and robust iterative closest point algorithm

Abstract

Recently, researchers have proposed deterministic and statistical appearance-based 3D face tracking methods that successfully tackle the image-variability and drift problems. However, appearance-based methods dedicated to 3D face tracking may suffer from inaccuracies, since they are not very sensitive to out-of-plane motion variations. On the other hand, dense 3D facial data provided by a stereo rig or a range sensor can yield very accurate 3D face motions/poses, but this paradigm requires either accurate facial feature extraction or a computationally expensive registration technique (e.g., the Iterative Closest Point algorithm). In this paper, we propose a 3D face tracker based on appearance registration and on a fast variant of a robust Iterative Closest Point algorithm. The resulting 3D face tracker combines the advantages of appearance-based trackers and 3D data-based trackers. Experiments on real video data show the feasibility and usefulness of the proposed approach. © Springer-Verlag Berlin Heidelberg 2006.
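
For intuition, below is a minimal sketch (in Python, using NumPy and SciPy) of one robust ICP alignment step of the general kind the abstract refers to; it is not the authors' fast variant. The Tukey biweight, the cutoff parameter c, and the Horn/Kabsch SVD pose estimate are illustrative assumptions, not details taken from the paper.

    # Sketch of one robust ICP iteration: nearest-neighbour correspondences,
    # Tukey-style weights to down-weight outliers, and a weighted closed-form
    # rigid-motion estimate (Horn/Kabsch). Illustrative only, not the paper's
    # specific fast variant.
    import numpy as np
    from scipy.spatial import cKDTree

    def robust_icp_step(model_pts, data_pts, c=10.0):
        """Return a rotation R and translation t moving model_pts (Nx3)
        towards data_pts (Mx3); c is an assumed outlier cutoff."""
        # 1. Closest-point correspondences via a k-d tree.
        dists, idx = cKDTree(data_pts).query(model_pts)
        matched = data_pts[idx]

        # 2. Robust weights: Tukey biweight, zero beyond the cutoff c.
        r = dists / c
        w = np.where(r < 1.0, (1.0 - r**2) ** 2, 0.0)

        # 3. Weighted closed-form rigid alignment via SVD.
        W = w.sum()
        mu_m = (w[:, None] * model_pts).sum(0) / W
        mu_d = (w[:, None] * matched).sum(0) / W
        H = ((model_pts - mu_m) * w[:, None]).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = mu_d - R @ mu_m
        return R, t

    # Usage: iterate from the pose predicted by the appearance-based tracker
    # until the pose update is negligible.
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        data = rng.normal(size=(500, 3))                 # stand-in for dense facial data
        model = data[:200] + np.array([0.1, 0.0, 0.0])   # slightly shifted subset
        R, t = robust_icp_step(model, data)
        print(R, t)

In the context of the paper, the role of such a registration step would be to correct the pose components (notably out-of-plane motion) to which the appearance-based tracker is less sensitive; the authors' specific fast and robust ICP variant is detailed in the full text.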

Citation (APA)

Dornaika, F., & Sappa, A. D. (2006). 3D face tracking using appearance registration and robust iterative closest point algorithm. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4263 LNCS, pp. 532–541). Springer Verlag. https://doi.org/10.1007/11902140_57
