Specific Area Style Transfer on Real Time Video

Abstract

Since deep learning applications in object recognition, object detection, segmentation, and image generation are increasingly in demand, related research has been conducted actively. In this paper, a method that combines segmentation and style transfer to produce a desired style in a desired area of real-time video is proposed. Two deep neural networks were used to operate as close to real time as possible, given the trade-off between speed and accuracy. A modified BiSeNet for segmentation and CycleGAN for style transfer were run on a desktop PC equipped with two RTX 2080 Ti GPU boards, enabling near-real-time processing of SD video at a reasonable quality level. Good subjective quality was obtained when segmenting the road area in a city-street video and converting it into a grass style at no less than 6 fps.
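
The core idea can be illustrated with a short sketch (not the authors' code): segment each frame, then blend the stylized frame back into the original only inside the target region. The `seg_model` and `style_model` callables below are crude stand-ins for the paper's modified BiSeNet and CycleGAN generator, and the input path is hypothetical; any model returning a per-pixel class map and a same-size stylized frame would slot in the same way.

```python
# Minimal sketch of specific-area style transfer on video (stand-in models, hypothetical input).
import cv2
import numpy as np

def seg_model(frame):
    """Stand-in segmenter: pretend the lower half of the frame is road (class 1)."""
    labels = np.zeros(frame.shape[:2], dtype=np.int32)
    labels[frame.shape[0] // 2:, :] = 1
    return labels

def style_model(frame):
    """Stand-in 'grass' styler: push the hue toward green instead of running CycleGAN."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    hsv[..., 0] = 60  # green in OpenCV's 0-179 hue range
    return cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)

def stylize_region(frame, target_class=1):
    """Apply the style only where the segmentation map equals target_class."""
    mask = (seg_model(frame) == target_class).astype(np.float32)
    mask = cv2.GaussianBlur(mask, (5, 5), 0)[..., None]  # soften the seam
    stylized = style_model(frame).astype(np.float32)
    out = mask * stylized + (1.0 - mask) * frame.astype(np.float32)
    return out.astype(np.uint8)

cap = cv2.VideoCapture("city_street.mp4")  # hypothetical SD city-street clip
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = stylize_region(frame)
    # display or encode `result` here, e.g. with cv2.imshow or cv2.VideoWriter
cap.release()
```

In the paper the two networks are the expensive steps; the per-pixel mask blend itself is cheap, which is why the achievable frame rate is governed by the segmentation and style-transfer models rather than the compositing.
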

Citation (APA)

Ko, H.-H., Kim, G., & Kim, H. (2021). Specific Area Style Transfer on Real Time Video. International Journal of Innovative Technology and Exploring Engineering, 10(5), 50–56. https://doi.org/10.35940/ijitee.e8689.0310521
