Depth-aware arbitrary style transfer using instance normalization

Abstract

Style transfer is the process of rendering one image, providing the content, in the style of another image, providing the style. A recent study by Liu et al. (2017) shows that the traditional style transfer methods of Gatys et al. (2016) and Johnson et al. (2016) fail to reproduce the depth of the content image, which is critical for human perception. They suggest preserving the depth map by adding a regularizer to the optimized loss function. However, these traditional methods are either computationally inefficient or require training a separate neural network for each style. The AdaIN method of Huang et al. (2017) allows efficient transfer of arbitrary styles without training a separate model per style, but it does not reproduce the depth map of the content image. We propose an extension to this method that preserves the depth map by applying spatially variable stylization strength. Qualitative analysis and the results of a user evaluation study indicate that the proposed method provides better stylizations than the original AdaIN style transfer method.
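To illustrate the idea of depth-aware stylization strength, the following is a minimal sketch (not the authors' implementation): standard AdaIN aligns the channel-wise mean and standard deviation of content features to those of the style features, and a per-pixel blending weight derived from a normalized depth map then controls how strongly each region is stylized. The function names, the depth convention (larger values = closer), and the mapping from depth to strength are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def adain(content_feat, style_feat, eps=1e-5):
    """AdaIN of Huang et al. (2017): align channel-wise mean/std of the
    content features with those of the style features."""
    c_mean = content_feat.mean(dim=(2, 3), keepdim=True)
    c_std = content_feat.std(dim=(2, 3), keepdim=True) + eps
    s_mean = style_feat.mean(dim=(2, 3), keepdim=True)
    s_std = style_feat.std(dim=(2, 3), keepdim=True) + eps
    return s_std * (content_feat - c_mean) / c_std + s_mean


def depth_aware_adain(content_feat, style_feat, depth_map, max_strength=1.0):
    """Blend AdaIN output with the original content features using a
    per-pixel strength map derived from the depth map (an assumed scheme:
    closer regions are stylized less to preserve perceived depth).

    depth_map: (B, 1, H, W) tensor, larger values assumed closer to camera.
    """
    stylized = adain(content_feat, style_feat)
    # Normalize depth to [0, 1] and resize to the feature resolution.
    d = (depth_map - depth_map.amin()) / (depth_map.amax() - depth_map.amin() + 1e-8)
    d = F.interpolate(d, size=content_feat.shape[2:], mode="bilinear",
                      align_corners=False)
    # Closer (larger d) -> weaker stylization; farther -> stronger.
    alpha = max_strength * (1.0 - d)
    return alpha * stylized + (1.0 - alpha) * content_feat
```

The blended features would then be passed through the decoder of the AdaIN pipeline as usual; the only change is that the scalar stylization strength becomes a spatial map tied to the estimated depth.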

Citation (APA)

Kitov, V., Kozlovtsev, K., & Mishustina, M. (2020). Depth-aware arbitrary style transfer using instance normalization. In CEUR Workshop Proceedings (Vol. 2744). CEUR-WS. https://doi.org/10.51130/graphicon-2020-2-3-2
