Shape constraint and multi-feature fusion particle filter for facial feature point tracking

Abstract

The traditional active shape model (ASM) and optical flow tracking methods are mainly suited to near-frontal faces or faces that vary little, and they can easily fail when the face undergoes changes in pose, expression, or occlusion. This paper presents a particle filter method for facial feature point tracking based on color and texture features and a shape constraint model. Because the nostril feature point region rarely undergoes non-rigid changes during tracking, we extract the color and texture features of this region to build the observation model. The remaining feature points then take it as a reference, and a geometric shape constraint model is built for real-time tracking. If the tracking error exceeds a threshold, the ASM search is restarted and the observation model is updated so that each feature point can be tracked accurately. Experimental results demonstrate the effectiveness and accuracy of the proposed method. © Springer International Publishing 2013.
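
To make the described tracking loop concrete, below is a minimal Python sketch of one particle-filter step along the lines of the abstract: particles are propagated, weighted by how closely a color-and-texture descriptor matches the nostril observation model, and the remaining feature points are placed through fixed geometric offsets from the estimated reference point. All names and choices here (extract_color_texture, track_frame, the random-walk motion model, the gradient-histogram texture descriptor, and the fixed-offset shape constraint) are illustrative assumptions, not the authors' implementation; the paper's actual observation and shape models may differ.

```python
import numpy as np

def extract_color_texture(frame, point, patch=8):
    """Color/texture descriptor of a small patch around a point.
    Assumption: mean color plus a gradient-magnitude histogram."""
    h, w = frame.shape[:2]
    x = int(np.clip(point[0], patch, w - patch - 1))
    y = int(np.clip(point[1], patch, h - patch - 1))
    region = frame[y - patch:y + patch, x - patch:x + patch].astype(float)
    color = region.reshape(-1, region.shape[-1]).mean(axis=0)
    gray = region.mean(axis=-1)
    gy, gx = np.gradient(gray)
    texture, _ = np.histogram(np.hypot(gx, gy), bins=8,
                              range=(0.0, 128.0), density=True)
    return np.concatenate([color / 255.0, texture])

def track_frame(frame, particles, ref_feature, offsets, sigma=3.0):
    """One particle-filter step for the nostril reference point.
    The other feature points follow via geometric offsets (shape constraint)."""
    # Propagate particles with a random-walk motion model (assumption).
    particles = particles + np.random.normal(0.0, sigma, particles.shape)
    # Weight each particle by similarity of its descriptor to the model.
    feats = np.array([extract_color_texture(frame, p) for p in particles])
    dists = np.linalg.norm(feats - ref_feature, axis=1)
    weights = np.exp(-dists ** 2 / 0.1) + 1e-12
    weights /= weights.sum()
    # Estimate the reference (nostril) point, then resample the particles.
    ref_point = weights @ particles
    idx = np.random.choice(len(particles), len(particles), p=weights)
    particles = particles[idx]
    # Shape constraint: remaining points placed by fixed offsets from the reference.
    points = ref_point + offsets
    error = dists.min()
    return particles, ref_point, points, error
```

In use, the particles and ref_feature would be initialized from an ASM search on the first frame; when the returned error exceeds a chosen threshold, the ASM search would be rerun to re-initialize the particles and refresh the observation model, as the abstract describes.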

Citation (APA)

Zhao, T., Gong, X., Li, T., Li, X., & Xiong, W. (2013). Shape constraint and multi-feature fusion particle filter for facial feature point tracking. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8232 LNCS, pp. 43–50). https://doi.org/10.1007/978-3-319-02961-0_6
