Improving deep interactive evolution with a style-based generator for artistic expression and creative exploration


Abstract

Deep interactive evolution (DeepIE) combines the capacity of interactive evolutionary computation (IEC) to capture a user’s preferences with the domain-specific robustness of a trained generative adversarial network (GAN) generator, allowing the user to control the GAN output through evolutionary exploration of the latent space. However, the traditional GAN latent space exhibits feature entanglement, which limits the practicality of potential applications of DeepIE. In this paper, we implement DeepIE within a style-based generator from a StyleGAN model trained on the WikiArt dataset and propose StyleIE, a variation of DeepIE that takes advantage of the secondary, disentangled latent space in the style-based generator. We performed two AB/BA crossover user tests comparing the performance of DeepIE against StyleIE for art generation. Self-rated evaluations of performance were collected through a questionnaire. Findings from the tests suggest that StyleIE and DeepIE perform equally in open-ended tasks with relaxed constraints, but StyleIE performs better in close-ended, more constrained tasks.
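The core loop the abstract describes, evolving latent vectors toward a user's picks while a frozen generator renders candidates, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the generator call is stubbed out, and the population size, mutation scale, and crossover scheme are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 512      # dimensionality of a StyleGAN latent code (z or w)
POP_SIZE = 8          # candidate images shown to the user per generation
MUTATION_SCALE = 0.3  # illustrative value, not taken from the paper

def next_generation(population, selected_idx, mutation_scale=MUTATION_SCALE):
    """One DeepIE-style step: keep the user-selected latents (elitism),
    then refill the population with uniform crossovers of the selections
    plus Gaussian mutation."""
    parents = population[selected_idx]
    children = [parents[i] for i in range(len(parents))]  # elitism
    while len(children) < len(population):
        a, b = parents[rng.integers(len(parents), size=2)]
        mask = rng.random(LATENT_DIM) < 0.5                  # uniform crossover
        child = np.where(mask, a, b)
        child += rng.normal(0.0, mutation_scale, LATENT_DIM)  # mutation
        children.append(child)
    return np.stack(children)

# Initial population of latent codes; in DeepIE each row would be fed to a
# frozen generator G(z) to render an image for the user to rate. In StyleIE
# the same loop would instead operate on the disentangled secondary space.
pop0 = rng.normal(size=(POP_SIZE, LATENT_DIM))
pop = next_generation(pop0, selected_idx=[0, 3])  # user picked images 0 and 3
```

In practice the loop repeats: render `pop`, collect the user's next selections, and call `next_generation` again until the user is satisfied with an image.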

Citation (APA)

Tejeda-Ocampo, C., López-Cuevas, A., & Terashima-Marin, H. (2021). Improving deep interactive evolution with a style-based generator for artistic expression and creative exploration. Entropy, 23(1), 1–28. https://doi.org/10.3390/e23010011
