Real-time facial feature tracking on a mobile device

Abstract

This paper presents an implementation of the Active Appearance Model that is able to track a face on a mobile device in real time. We achieve this performance by discarding an explicit texture model, using fixed-point arithmetic for much of the computation, applying a sequence of models of increasing complexity, and exploiting a sparse basis projection via Haar-like features. Our results show that the Haar-like feature basis achieves performance similar to more traditional approaches while being better suited to a mobile device. Finally, we discuss mobile applications of the system such as face verification, teleconferencing and human-computer interaction. © 2011 Springer Science+Business Media, LLC.
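The abstract names two implementation ideas, Haar-like feature responses and fixed-point arithmetic, that a short sketch can make concrete. The C++ below is a minimal illustration under assumptions of our own (a Q16.16 number format, a two-rectangle feature evaluated on an integral image, and an arbitrary weight standing in for one entry of a sparse basis projection); it is not the authors' code and does not reproduce their model, basis, or number formats.

```cpp
// Illustrative sketch only: Haar-like feature response from an integral image,
// combined with a Q16.16 fixed-point weighted sum. All names, formats and
// constants here are assumptions for demonstration, not the paper's method.
#include <cstdint>
#include <cstdio>
#include <vector>

using Fixed = int32_t;                 // Q16.16 fixed-point value (assumed format)
constexpr int FRAC_BITS = 16;

inline Fixed toFixed(double v)  { return static_cast<Fixed>(v * (1 << FRAC_BITS)); }
inline double toDouble(Fixed v) { return static_cast<double>(v) / (1 << FRAC_BITS); }
inline Fixed fixedMul(Fixed a, Fixed b) {
    // 64-bit intermediate avoids overflow before rescaling.
    return static_cast<Fixed>((static_cast<int64_t>(a) * b) >> FRAC_BITS);
}

// Summed-area (integral) image over an 8-bit grayscale frame.
struct IntegralImage {
    int w, h;
    std::vector<int64_t> sum;          // (w+1) x (h+1), zero-padded top/left
    IntegralImage(const std::vector<uint8_t>& img, int width, int height)
        : w(width), h(height), sum((width + 1) * (height + 1), 0) {
        for (int y = 0; y < h; ++y)
            for (int x = 0; x < w; ++x)
                sum[(y + 1) * (w + 1) + (x + 1)] =
                    img[y * w + x]
                    + sum[y * (w + 1) + (x + 1)]
                    + sum[(y + 1) * (w + 1) + x]
                    - sum[y * (w + 1) + x];
    }
    // Sum of pixels in the rectangle [x, x+rw) x [y, y+rh).
    int64_t rectSum(int x, int y, int rw, int rh) const {
        return sum[(y + rh) * (w + 1) + (x + rw)]
             - sum[y * (w + 1) + (x + rw)]
             - sum[(y + rh) * (w + 1) + x]
             + sum[y * (w + 1) + x];
    }
};

// Two-rectangle Haar-like feature: left half minus right half.
int64_t haarTwoRect(const IntegralImage& ii, int x, int y, int rw, int rh) {
    return ii.rectSum(x, y, rw, rh) - ii.rectSum(x + rw, y, rw, rh);
}

int main() {
    // Tiny synthetic frame: left half bright, right half dark.
    const int W = 8, H = 8;
    std::vector<uint8_t> frame(W * H, 0);
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W / 2; ++x)
            frame[y * W + x] = 200;

    IntegralImage ii(frame, W, H);
    int64_t response = haarTwoRect(ii, 0, 0, W / 2, H);

    // Hypothetical fixed-point weight applied to a normalised response,
    // standing in for one coefficient of a sparse Haar-like basis projection.
    Fixed weight = toFixed(0.03125);
    Fixed normalised = toFixed(static_cast<double>(response) / (255.0 * W * H / 2));
    Fixed contribution = fixedMul(weight, normalised);

    std::printf("Haar response: %lld, weighted (fixed-point): %f\n",
                static_cast<long long>(response), toDouble(contribution));
    return 0;
}
```

Only the integral image needs floating point here for normalisation in the demo; in a production fixed-point pipeline that step would also be expressed with integer shifts, which is the kind of saving the abstract attributes to fixed-point arithmetic on a mobile device.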

Citation (APA)

Tresadern, P. A., Ionita, M. C., & Cootes, T. F. (2012). Real-time facial feature tracking on a mobile device. International Journal of Computer Vision, 96(3), 280–289. https://doi.org/10.1007/s11263-011-0464-9
