Artistic Style Transfer for Videos and Spherical Images


Abstract

Manually re-drawing an image in a certain artistic style takes a professional artist a long time. Doing this for a video sequence single-handedly is beyond imagination. We present two computational approaches that transfer the style from one image (for example, a painting) to a whole video sequence. In our first approach, we adapt to videos the original image style transfer technique by Gatys et al. based on energy minimization. We introduce new ways of initialization and new loss functions to generate consistent and stable stylized video sequences even in cases with large motion and strong occlusion. Our second approach formulates video stylization as a learning problem. We propose a deep network architecture and training procedures that allow us to stylize arbitrary-length videos in a consistent and stable way, and nearly in real time. We show that the proposed methods clearly outperform simpler baselines both qualitatively and quantitatively. Finally, we propose a way to adapt these approaches also to 360° images and videos as they emerge with recent virtual reality hardware.
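The key ingredient behind the "consistent and stable" stylization mentioned in the abstract is a temporal consistency loss: the previously stylized frame is warped into the current frame with optical flow, and deviations are penalized only where the flow is reliable (i.e., outside disoccluded regions and motion boundaries). The sketch below illustrates such a loss term in PyTorch; the tensor shapes, the flow convention, and the helper names are assumptions for illustration, not the authors' reference implementation.

```python
# Minimal sketch of a short-term temporal consistency loss for video style
# transfer (assumed PyTorch; shapes, flow convention, and names are illustrative).
import torch
import torch.nn.functional as F


def warp(frame, flow):
    """Backward-warp `frame` (N, C, H, W) with optical flow `flow` (N, 2, H, W).

    The flow is assumed to give, for each pixel of the current frame, the
    displacement (in pixels) to its corresponding location in the previous frame.
    """
    n, _, h, w = frame.shape
    # Base sampling grid of pixel coordinates (x, y).
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=frame.dtype, device=frame.device),
        torch.arange(w, dtype=frame.dtype, device=frame.device),
        indexing="ij",
    )
    grid = torch.stack((xs, ys), dim=0).unsqueeze(0) + flow  # (N, 2, H, W)
    # Normalize coordinates to [-1, 1] as required by grid_sample.
    grid_x = 2.0 * grid[:, 0] / max(w - 1, 1) - 1.0
    grid_y = 2.0 * grid[:, 1] / max(h - 1, 1) - 1.0
    grid = torch.stack((grid_x, grid_y), dim=-1)  # (N, H, W, 2)
    return F.grid_sample(frame, grid, align_corners=True)


def temporal_loss(stylized_cur, stylized_prev, flow, weight_mask):
    """Masked mean squared error between the current stylized frame and the
    flow-warped previous stylized frame.

    `weight_mask` (N, 1, H, W) is close to 0 in disoccluded regions and at
    motion boundaries, and close to 1 where the flow is trustworthy.
    """
    warped_prev = warp(stylized_prev, flow)
    diff = (stylized_cur - warped_prev) ** 2
    return (weight_mask * diff).mean()
```

In the optimization-based approach this term would be added to the usual content and style losses of each frame; in the learning-based approach a loss of this kind can instead be used during training of the feed-forward stylization network.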

Cite

CITATION STYLE

APA

Ruder, M., Dosovitskiy, A., & Brox, T. (2018). Artistic Style Transfer for Videos and Spherical Images. International Journal of Computer Vision, 126(11), 1199–1219. https://doi.org/10.1007/s11263-018-1089-z
