Image-based rendering and tracking of faces

Abstract

In this paper, we present an image-based method for the tracking and rendering of faces. We use the algorithm in an immersive video conferencing system where multiple participants are placed at a virtual table. This requires viewpoint modification of dynamic objects. Since hair and uncovered areas are difficult to model by pure 3-D geometry-based warping, we add image-based rendering techniques to the system. By interpolating novel views from a 3-D image volume, natural-looking results can be achieved. The image-based component is embedded into a geometry-based approach that models temporally changing facial features. Both geometry and image cube information are jointly exploited in facial expression analysis and synthesis. © 2005 IEEE.
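To illustrate the view-interpolation idea mentioned in the abstract, the sketch below synthesizes a novel view from a stack of images (a 3-D image volume) by blending the two nearest captured views. This is only a minimal illustration under assumed names and array shapes; the paper's actual method additionally combines this with geometry-based warping of facial features, which is not reproduced here.

```python
import numpy as np

def interpolate_view(image_volume: np.ndarray, viewpoint: float) -> np.ndarray:
    """Synthesize a novel view from a 3-D image volume.

    image_volume : array of shape (N, H, W, C), images captured from
                   N equally spaced camera positions.
    viewpoint    : fractional position in [0, N-1] at which to render.

    Plain cross-fading between the two nearest captured views; the
    paper's approach also exploits geometry information before blending.
    """
    n_views = image_volume.shape[0]
    viewpoint = float(np.clip(viewpoint, 0.0, n_views - 1.0))

    lower = int(np.floor(viewpoint))
    upper = min(lower + 1, n_views - 1)
    alpha = viewpoint - lower  # blending weight for the upper view

    novel = (1.0 - alpha) * image_volume[lower].astype(np.float32) \
            + alpha * image_volume[upper].astype(np.float32)
    return novel.astype(image_volume.dtype)

# Example: render a view halfway between captured images 3 and 4.
if __name__ == "__main__":
    volume = np.random.randint(0, 256, size=(8, 64, 64, 3), dtype=np.uint8)
    frame = interpolate_view(volume, 3.5)
    print(frame.shape)  # (64, 64, 3)
```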

Citation (APA)
Eisert, P., & Rurainsky, J. (2005). Image-based rendering and tracking of faces. In Proceedings - International Conference on Image Processing, ICIP (Vol. 1, pp. 1037–1040). https://doi.org/10.1109/ICIP.2005.1529931
