High-Resolution Neural Texture Synthesis with Long-Range Constraints

Abstract

The field of texture synthesis has witnessed important progress over recent years, most notably through the use of convolutional neural networks. However, neural synthesis methods still struggle to reproduce large-scale structures, especially for high-resolution textures. To address this issue, we first introduce a simple multi-resolution framework that efficiently accounts for long-range dependencies. We then show that additional statistical constraints further improve the reproduction of textures with strong regularity. This can be achieved by constraining both the Gram matrices of a neural network's features and the power spectrum of the image. Alternatively, one may constrain only the autocorrelation of the network's features and drop the Gram matrix constraints. In the experimental part, the proposed methods are extensively tested and compared to alternative approaches, both with unsupervised metrics and through a user study. Experiments show the advantage of the multi-scale scheme for high-resolution textures, and the advantage of combining it with additional constraints for regular textures.
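To make the statistics mentioned in the abstract concrete, the following is a minimal NumPy sketch of the three quantities involved: the Gram matrix of a layer's feature maps, the power spectrum of the image, and the autocorrelation of a feature map (via the Wiener–Khinchin relation). Function names and shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of feature maps with shape (C, H, W).

    Returns a (C, C) matrix of channel-wise inner products,
    normalized by the number of spatial positions.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (h * w)

def power_spectrum(image):
    """Squared modulus of the 2-D Fourier transform (phase discarded)."""
    return np.abs(np.fft.fft2(image)) ** 2

def autocorrelation(feature_map):
    """Autocorrelation via Wiener-Khinchin: inverse FFT of the
    power spectrum, normalized so the zero lag equals the mean square."""
    ps = np.abs(np.fft.fft2(feature_map)) ** 2
    return np.real(np.fft.ifft2(ps)) / feature_map.size
```

Constraining only the autocorrelation, rather than the Gram matrices, retains spatial (shift-dependent) second-order information about the features, which is why it helps with strongly regular textures.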

Citation (APA)

Gonthier, N., Gousseau, Y., & Ladjal, S. (2022). High-Resolution Neural Texture Synthesis with Long-Range Constraints. Journal of Mathematical Imaging and Vision, 64(5), 478–492. https://doi.org/10.1007/s10851-022-01078-y
