Local spatiotemporal features for dynamic texture synthesis

Abstract

In this paper, we study the use of local spatiotemporal patterns in a non-parametric dynamic texture synthesis method. Given a finite sample video of a texture in motion, dynamic texture synthesis creates a new video sequence, perceptually similar to the input, with an enlarged frame size and a longer duration. In general, non-parametric techniques select and copy regions from the input sample to serve as building blocks, pasting them together one at a time onto the outcome. In order to minimize possible discontinuities between adjacent blocks, the proper representation and selection of such pieces become key issues. In previous synthesis methods, the block description has been based only on pixel intensities, ignoring the texture structure and dynamics. Furthermore, a seam optimization between neighboring blocks has been a fundamental step to avoid discontinuities. In our synthesis approach, we propose to use local spatiotemporal cues extracted with the local binary patterns from three orthogonal planes (LBP-TOP) operator, which allows us to include both the appearance and the motion of the dynamic texture in the video characterization. This improved representation leads to a better fitting and matching between adjacent blocks, and therefore the spatial similarity, temporal behavior, and continuity of the input can be successfully preserved. Moreover, the proposed method is simpler than other approaches, since no additional seam optimization is needed to obtain smooth transitions between video blocks. The experiments show that the LBP-TOP representation outperforms other methods, without generating visible discontinuities or annoying artifacts. The results are evaluated using a double-stimulus continuous quality scale methodology, which is reproducible and objective. We also present results for the use of our method in video completion tasks. Additionally, we show that the proposed technique is easily extendable to achieve synthesis in both the spatial and temporal domains. © 2014 Lizarraga-Morales et al.
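
To make the block-matching idea concrete, the sketch below shows one simplified way such a descriptor and matching step could look in Python with NumPy and scikit-image. It is an illustration only, not the authors' implementation: the function names (lbp_top_histogram, best_matching_block), the parameter choices (P=8, R=1, the "uniform" mapping), the chi-square distance, and the restriction to the three central orthogonal planes (the full LBP-TOP operator aggregates histograms over all slices of the volume) are assumptions made here for brevity.

import numpy as np
from skimage.feature import local_binary_pattern

def lbp_top_histogram(block, P=8, R=1):
    """Concatenate uniform-LBP histograms from the XY, XT, and YT planes
    of a grayscale video block of shape (T, H, W), dtype uint8."""
    t, h, w = block.shape
    planes = [
        block[t // 2, :, :],  # XY plane: appearance at the central frame
        block[:, h // 2, :],  # XT plane: horizontal motion at the central row
        block[:, :, w // 2],  # YT plane: vertical motion at the central column
    ]
    n_bins = P + 2  # number of codes produced by the "uniform" LBP mapping
    feats = []
    for plane in planes:
        codes = local_binary_pattern(plane, P, R, method="uniform")
        hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
        feats.append(hist)
    return np.concatenate(feats)

def chi_square(p, q, eps=1e-10):
    """Chi-square distance between two normalized histograms."""
    return 0.5 * np.sum((p - q) ** 2 / (p + q + eps))

def best_matching_block(boundary_block, candidate_blocks):
    """Return the index of the candidate whose LBP-TOP descriptor is closest
    to the descriptor of the already-synthesized boundary region."""
    target = lbp_top_histogram(boundary_block)
    dists = [chi_square(target, lbp_top_histogram(c)) for c in candidate_blocks]
    return int(np.argmin(dists))

In a full synthesis pipeline, candidate blocks taken from the input sample would be scored against the descriptor of the region already placed in the output. According to the abstract, matching in this combined appearance-and-motion feature space is what allows adjacent blocks to join smoothly without a separate seam-optimization step.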

Citation (APA)

Lizarraga-Morales, R. A., Guo, Y., Zhao, G., Pietikäinen, M., & Sanchez-Yanez, R. E. (2014). Local spatiotemporal features for dynamic texture synthesis. EURASIP Journal on Image and Video Processing, 2014. https://doi.org/10.1186/1687-5281-2014-17
