Interactive exploration and refinement of facial expression using manifold learning

Abstract

Posing expressive 3D faces is extremely challenging. Typical facial rigs have upwards of 30 controllable parameters that, while anatomically meaningful, are hard to use due to redundancy of expression, unrealistic configurations, and many semantic and stylistic correlations between the parameters. We propose a novel interface for rapid exploration and refinement of static facial expressions, based on a data-driven face manifold of natural expressions. Rapidly explored face configurations are interactively projected onto this manifold of meaningful expressions. These expressions can then be refined using a 2D embedding of nearby faces, both on and off the manifold. Our validation is fourfold: we show expressive face creation using various devices; we verify that our learnt manifold transcends its training face, to expressively control very different faces; we perform a crowd-sourced study to evaluate the quality of manifold face expressions; and we report on a usability study that shows our approach is an effective interactive tool to author facial expression.
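The core idea of the abstract, snapping a freely explored rig configuration onto a learned manifold of plausible expressions, can be sketched in a few lines. The paper does not specify its manifold model here, so the sketch below stands in a linear subspace fit by PCA for the learned manifold; the example expression data, the 30-parameter rig, and the function names are all illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch: PCA stands in for the paper's learned expression
# manifold. Example data and the 30-parameter rig are made up for
# illustration.

def learn_manifold(examples, dim=2):
    """Fit a low-dimensional linear subspace to example expression vectors."""
    mean = examples.mean(axis=0)
    # Principal directions via SVD of the centered data; rows of vt are
    # orthonormal, so vt[:dim] is a (dim x n_params) orthonormal basis.
    _, _, vt = np.linalg.svd(examples - mean, full_matrices=False)
    return mean, vt[:dim]

def project(params, mean, basis):
    """Snap an arbitrary rig configuration to the nearest point on the manifold."""
    coords = basis @ (params - mean)   # embed into manifold coordinates
    return mean + basis.T @ coords     # reconstruct a plausible expression

# Toy data: 50 example expressions over a 30-parameter rig.
rng = np.random.default_rng(0)
examples = rng.normal(size=(50, 30))
mean, basis = learn_manifold(examples, dim=2)

rough = rng.normal(size=30)            # a rapidly explored configuration
snapped = project(rough, mean, basis)  # its projection onto the manifold
```

Because the basis is orthonormal, projecting a second time changes nothing: `project(snapped, mean, basis)` returns `snapped`, which is the sense in which a rough configuration is "snapped" to the manifold of meaningful expressions.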

Citation (APA)

Abdrashitov, R., Chevalier, F., & Singh, K. (2020). Interactive exploration and refinement of facial expression using manifold learning. In UIST 2020 - Proceedings of the 33rd Annual ACM Symposium on User Interface Software and Technology (pp. 778–790). Association for Computing Machinery, Inc. https://doi.org/10.1145/3379337.3415877
