FaceController: Controllable Attribute Editing for Face in the Wild


Abstract

Face attribute editing aims to generate faces in which one or more desired attributes are manipulated while all other details are preserved. Unlike prior approaches such as GAN inversion, which require an expensive reverse-mapping process, we propose a simple feed-forward network that generates high-fidelity manipulated faces. By employing existing, easily obtainable prior information, our method can control, transfer, and edit diverse attributes of faces in the wild. The proposed method can consequently be applied to various applications such as face swapping, face relighting, and makeup transfer. In our method, we decouple identity, expression, pose, and illumination using 3D priors, and separate texture and color using region-wise style codes. All of this information is embedded into adversarial learning by our identity-style normalization module. Disentanglement losses are proposed to encourage the generator to extract information independently for each attribute. Comprehensive quantitative and qualitative evaluations have been conducted. Within a single framework, our method achieves the best or competitive scores across a variety of face applications.
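The abstract does not spell out the internals of the identity-style normalization module. As a rough, hypothetical sketch: modules of this kind are commonly built on adaptive instance normalization (AdaIN), where a feature channel is first instance-normalized and then re-scaled and re-shifted by parameters predicted from the injected codes (here, identity and region-wise style codes). The function below illustrates only that general mechanism, not the paper's actual architecture:

```python
import math

def adain(features, gamma, beta, eps=1e-5):
    """AdaIN-style normalization for a single feature channel.

    `features` is a flat list of activations; `gamma` and `beta` stand in
    for scale/shift parameters that, in an identity-style normalization
    module, would be predicted from identity and style codes (hypothetical
    simplification -- the paper's module is not specified in the abstract).
    """
    # Instance normalization: remove the channel's own mean and std.
    mean = sum(features) / len(features)
    var = sum((x - mean) ** 2 for x in features) / len(features)
    std = math.sqrt(var + eps)
    # Re-inject statistics derived from the conditioning codes.
    return [gamma * (x - mean) / std + beta for x in features]


out = adain([1.0, 2.0, 3.0, 4.0], gamma=2.0, beta=0.5)
```

After normalization, the channel's mean equals `beta` and its standard deviation approaches `gamma`, which is how the conditioning codes steer the generated attributes.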

Citation (APA)

Xu, Z., Yu, X., Hong, Z., Zhu, Z., Han, J., Liu, J., … Bai, X. (2021). FaceController: Controllable Attribute Editing for Face in the Wild. In 35th AAAI Conference on Artificial Intelligence, AAAI 2021 (Vol. 4A, pp. 3083–3091). Association for the Advancement of Artificial Intelligence. https://doi.org/10.1609/aaai.v35i4.16417
