Disentangled Face Attribute Editing via Instance-Aware Latent Space Search


Abstract

Recent works have shown that a rich set of semantic directions exists in the latent space of Generative Adversarial Networks (GANs), enabling various facial attribute editing applications. However, existing methods may suffer from poor disentanglement of attribute variations, leading to unwanted changes in other attributes when altering the desired one. The semantic directions used by existing methods are defined at the attribute level, which makes it difficult to model complex attribute correlations, especially in the presence of attribute distribution bias in the GAN's training set. In this paper, we propose a novel framework (IALS) that performs Instance-Aware Latent-Space Search to find semantic directions for disentangled attribute editing. Instance information is injected by leveraging supervision from a set of attribute classifiers evaluated on the input images. We further propose a Disentanglement-Transformation (DT) metric to quantify attribute transformation and disentanglement efficacy, and use it to find the optimal control factor between attribute-level and instance-specific directions. Experimental results on both GAN-generated and real-world images collectively show that our method outperforms recently proposed state-of-the-art methods by a wide margin. Code is available at https://github.com/yxuhan/IALS.
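
The core idea described in the abstract, blending a global attribute-level direction with an instance-specific direction obtained from attribute-classifier supervision on the current input, can be sketched in a few lines of PyTorch. The sketch below is illustrative only: the generator, classifier, precomputed attribute-level direction d_attr, the gradient-based instance direction, and the control factor lam are assumptions about one possible parameterization, not the authors' implementation (see the linked repository for that).

```python
import torch
import torch.nn.functional as F

def edit_latent(w, generator, classifier, d_attr, lam=0.5, step=1.0):
    """Edit a latent code along a blend of attribute-level and instance-specific directions.

    w          : latent code, shape (1, latent_dim)
    generator  : maps latent codes to images (e.g., a pretrained GAN generator)
    classifier : attribute classifier returning a logit for the target attribute
    d_attr     : precomputed attribute-level direction, shape (1, latent_dim)
    lam        : control factor trading off attribute-level vs. instance-specific
                 directions (hypothetical parameterization)
    step       : editing step size along the blended direction
    """
    w = w.clone().detach().requires_grad_(True)
    img = generator(w)                        # synthesize the image for this instance
    score = classifier(img).sum()             # target-attribute logit for this instance
    d_inst, = torch.autograd.grad(score, w)   # instance-specific direction from the classifier gradient

    # Normalize both directions and blend them with the control factor lam.
    d_attr = F.normalize(d_attr, dim=-1)
    d_inst = F.normalize(d_inst, dim=-1)
    d = F.normalize(lam * d_attr + (1.0 - lam) * d_inst, dim=-1)

    # Move the latent code along the blended direction to edit the attribute.
    return (w + step * d).detach()
```

In this reading, lam = 1 recovers a purely attribute-level edit, lam = 0 a purely instance-specific one, and the DT metric described above would be used to select the value in between that best trades off transformation strength against disentanglement.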

Citation (APA)

Han, Y., Yang, J., & Fu, Y. (2021). Disentangled Face Attribute Editing via Instance-Aware Latent Space Search. In IJCAI International Joint Conference on Artificial Intelligence (pp. 715–721). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2021/99
