Adversarially Regularized U-net-based GANs for Facial Attribute Modification and Generation

8 citations · 14 Mendeley readers

This article is free to access.

Abstract

Modifying and generating facial images with desired attributes are two important and closely related tasks in computer vision. Some existing methods exploit this relationship and use a unified model to handle both tasks simultaneously. However, producing images of high visual quality on both tasks remains a challenge. To tackle this issue, we propose a novel model called the adversarially regularized U-net (ARU-net)-based generative adversarial network (ARU-GAN). The ARU-net is the major component of the ARU-GAN and is inspired by the design principle of the U-net: it uses skip connections to pass features at different levels from the encoder to the decoder, preserving sufficient attribute-independent detail for the modification task. In addition, this U-net-like architecture employs an adversarial regularization term that guides the distribution of the latent representation to match a prior distribution, which guarantees that meaningful faces can be generated from samples of this prior. We also propose a joint training technique for the ARU-GAN that enables the facial attribute modification and generation tasks to be learned together during training. We perform experiments on the CelebFaces Attributes (CelebA) dataset and carry out visual analysis and quantitative evaluation on both tasks, demonstrating that our model can produce facial images of high visual quality. The results also show that learning the two tasks jointly improves performance compared with learning them individually. Finally, we further validate the effectiveness of our method through an ablation study and experiments on another dataset.
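The skip-connection mechanism the abstract describes can be sketched in a toy, framework-free form. This is a minimal illustration only: `down`, `up`, and `unet_forward` are hypothetical stand-ins for the paper's convolutional layers, and the 1-D lists stand in for feature maps; the actual ARU-net architecture and its adversarial regularizer are defined in the paper itself.

```python
# Toy sketch (pure Python, illustrative names) of the U-net idea: skip
# connections pass each encoder level's features directly to the matching
# decoder level, so attribute-independent detail survives the bottleneck.

def down(x):
    # Halve resolution by averaging adjacent pairs (stand-in for a strided conv).
    return [(x[i] + x[i + 1]) / 2 for i in range(0, len(x), 2)]

def up(x):
    # Double resolution by nearest-neighbour repetition (stand-in for a deconv).
    return [v for v in x for _ in range(2)]

def unet_forward(x, depth=2):
    skips = []
    for _ in range(depth):            # encoder: save features before downsampling
        skips.append(x)
        x = down(x)
    z = x                             # latent code (adversarially regularized in the paper)
    for skip in reversed(skips):      # decoder: upsample, then fuse the saved skip
        x = up(x)
        x = [a + b for a, b in zip(x, skip)]  # fuse (concat + conv in a real U-net)
    return z, x

z, y = unet_forward([1.0, 2.0, 3.0, 4.0])
# z == [2.5]; y has the same length as the input, with fine detail re-injected.
```

Because the decoder sees the encoder's features directly, only attribute-relevant information needs to pass through the latent code, which is what makes regularizing that code toward a prior (for generation) compatible with faithful reconstruction (for modification).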

Citation (APA)

Zhang, J., Li, A., Liu, Y., & Wang, M. (2019). Adversarially regularized U-net-based GANs for facial attribute modification and generation. IEEE Access, 7, 86453–86462. https://doi.org/10.1109/ACCESS.2019.2926633
