Image Neural Style Transfer with Global and Local Optimization Fusion


Abstract

This paper presents a new image synthesis method for image style transfer. With many existing methods, the textures and colors of the style image are applied inappropriately to the content image, which produces artifacts. To improve the results, we propose a novel method based on a strategy that combines local and global style losses. On the one hand, a style loss based on a local approach preserves the style details; on the other hand, a style loss based on global measures captures more global structural information. Results on a variety of images show that the proposed method reduces artifacts while faithfully transferring the style image's characteristics and preserving the structure and color of the content image.
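To illustrate the idea of fusing global and local style terms, below is a minimal sketch (not the authors' code) of how such a combined loss might be assembled, assuming PyTorch and feature maps taken from a pretrained CNN. The Gram-matrix term captures global feature statistics, while the patch-matching term enforces local texture detail; the patch size and fusion weights here are placeholders.

```python
import torch
import torch.nn.functional as F

def gram_matrix(feat):
    # feat: (1, C, H, W) -> normalized C x C Gram matrix (global statistics)
    _, c, h, w = feat.shape
    f = feat.reshape(c, h * w)
    return (f @ f.t()) / (c * h * w)

def global_style_loss(synth_feat, style_feat):
    # Match global feature correlations (Gram matrices)
    return F.mse_loss(gram_matrix(synth_feat), gram_matrix(style_feat))

def extract_patches(feat, size=3, stride=1):
    # Unfold a feature map into overlapping local patches: (N_patches, C*size*size)
    patches = F.unfold(feat, kernel_size=size, stride=stride)  # (1, C*k*k, N)
    return patches.squeeze(0).t()

def local_style_loss(synth_feat, style_feat, size=3, stride=1):
    # Patch-based (MRF-style) loss: each synthesized patch is compared with its
    # most similar style patch, which preserves local style detail.
    sp_synth = extract_patches(synth_feat, size, stride)
    sp_style = extract_patches(style_feat, size, stride)
    sim = F.normalize(sp_synth, dim=1) @ F.normalize(sp_style, dim=1).t()
    nearest = sim.argmax(dim=1)                  # index of best-matching style patch
    return F.mse_loss(sp_synth, sp_style[nearest])

def fused_style_loss(synth_feat, style_feat, w_global=1.0, w_local=1.0):
    # Weighted fusion of the global and local style terms (weights are assumptions)
    return (w_global * global_style_loss(synth_feat, style_feat)
            + w_local * local_style_loss(synth_feat, style_feat))
```

In practice the feature maps would come from selected layers of a pretrained network such as VGG, and the fused style loss would be minimized with respect to the synthesized image together with a content loss; the layer choices and weights above are illustrative, not the paper's settings.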

Citation (APA)

Zhao, H. H., Rosin, P. L., Lai, Y. K., Lin, M. G., & Liu, Q. Y. (2019). Image Neural Style Transfer with Global and Local Optimization Fusion. IEEE Access, 7, 85573–85580. https://doi.org/10.1109/ACCESS.2019.2922554
