Abstract
Human-machine interaction requires the ability to analyze and discern human faces. Due to the nature of the 3D-to-2D projection, the recognition of human faces from 2D images, in the presence of pose and illumination variations, is intrinsically an ill-posed problem. The direct measurement of the shape of the face surface is now a feasible way to overcome this problem and make it well-posed. This paper proposes a completely automatic algorithm for 3D face registration and matching based on the extraction of stable 3D facial features characterizing the face and the subsequent construction of a signature manifold. The facial features are extracted by performing a continuous-to-discrete scale-space analysis. Registration is driven by the matching of triplets of feature points, and the registration error is used as a shape matching score. A major advantage of the proposed method is that no data pre-processing is required. Despite the high dimensionality of the data (sets of 3D points, possibly with the associated texture), the resulting signature, and hence the template, is very small. The biometric template associated with each user is therefore not only very robust to environmental changes but also very compact. The method has been tested on the Bosphorus 3D face database and its performance compared to the ICP baseline algorithm. Even in the presence of noise in the data, the algorithm proved to be very robust and achieved identification performance in line with the current state of the art. © 2011 Springer-Verlag.
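The abstract itself contains no code. As an illustration only, the following minimal Python/NumPy sketch shows the general idea of registering a probe face to a gallery face from one matched triplet of 3D feature points (using the standard Kabsch alignment, which is an assumption, not necessarily the authors' estimator) and of using the residual registration error as a shape matching score. All function names, the toy data, and the mean nearest-neighbour score are hypothetical.

import numpy as np

def rigid_transform_from_triplet(src, dst):
    """Estimate rotation R and translation t mapping src -> dst (Kabsch).
    src, dst: (3, 3) arrays, one matched 3D feature point per row."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                   # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

def matching_score(probe_cloud, gallery_cloud, R, t):
    """Registration error as shape matching score: mean nearest-neighbour
    distance of the transformed probe points to the gallery points."""
    aligned = probe_cloud @ R.T + t
    # brute-force nearest neighbours; a k-d tree would be used in practice
    dists = np.linalg.norm(aligned[:, None, :] - gallery_cloud[None, :, :], axis=2)
    return dists.min(axis=1).mean()

# Toy usage: a gallery face, the same face under a pose change, one feature triplet.
rng = np.random.default_rng(0)
gallery = rng.normal(size=(200, 3))
angle = np.deg2rad(20.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -2.0, 1.0])
probe = gallery @ R_true.T + t_true                          # same face, different pose

triplet_idx = [10, 50, 120]                                  # hypothetical matched features
R, t = rigid_transform_from_triplet(probe[triplet_idx], gallery[triplet_idx])
print("matching score:", matching_score(probe, gallery, R, t))  # ~0 for the same face

In such a scheme, a low score indicates that the two surfaces align well under the transform estimated from the feature triplet, which is the role the registration error plays as a matching score in the abstract; the actual feature extraction, triplet selection, and signature construction described in the paper are not reproduced here.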
CITATION STYLE
Cadoni, M., Grosso, E., Lagorio, A., & Tistarelli, M. (2011). Interpreting 3D faces for augmented human-computer interaction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6766 LNCS, pp. 535–544). https://doi.org/10.1007/978-3-642-21663-3_58