Image analogies

Abstract

This paper describes a new framework for processing images by example, called "image analogies." The framework involves two stages: a design phase, in which a pair of images, with one image purported to be a "filtered" version of the other, is presented as "training data"; and an application phase, in which the learned filter is applied to some new target image in order to create an "analogous" filtered result. Image analogies are based on a simple multi-scale autoregression, inspired primarily by recent results in texture synthesis. By choosing different types of source image pairs as input, the framework supports a wide variety of "image filter" effects, including traditional image filters, such as blurring or embossing; improved texture synthesis, in which some textures are synthesized with higher quality than by previous approaches; super-resolution, in which a higher-resolution image is inferred from a low-resolution source; texture transfer, in which images are "texturized" with some arbitrary source texture; artistic filters, in which various drawing and painting styles are synthesized based on scanned real-world examples; and texture-by-numbers, in which realistic scenes, composed of a variety of textures, are created using a simple painting interface. © 2001 ACM.
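As a rough illustration of the two-phase framework described in the abstract, the sketch below implements a brute-force, single-scale version of the core matching step in Python with NumPy. The paper's actual algorithm is multi-scale and combines an approximate-nearest-neighbor search with a coherence term, so this is only a simplified approximation of the idea; all function and variable names here are illustrative, not taken from the paper.

```python
# Minimal single-scale sketch of the image-analogies idea: given an example
# pair (A, A') and a new image B, synthesize B' so that B : B' mimics A : A'.
# Brute-force search; the published method is multi-scale and much faster.
import numpy as np

def best_match(A, A_prime, B, B_prime, i, j, k=2):
    """Return the pixel of A whose neighborhood best matches B's at (i, j).

    The feature for a pixel is its k-radius neighborhood in the source image
    concatenated with the already-synthesized (causal) part of the filtered
    image, mirroring the autoregressive structure mentioned in the abstract.
    """
    h, w = A.shape

    def feature(img, img_f, y, x):
        patch = img[y - k:y + k + 1, x - k:x + k + 1].ravel()
        # Causal part: rows above plus pixels to the left in the current row.
        causal = img_f[y - k:y + 1, x - k:x + k + 1].ravel()[:-(k + 1)]
        return np.concatenate([patch, causal])

    target = feature(B, B_prime, i, j)
    best, best_dist = (k, k), np.inf
    for y in range(k, h - k):
        for x in range(k, w - k):
            d = np.sum((feature(A, A_prime, y, x) - target) ** 2)
            if d < best_dist:
                best, best_dist = (y, x), d
    return best

def image_analogy(A, A_prime, B, k=2):
    """Synthesize B' in scanline order (borders are left unfilled)."""
    B_prime = np.zeros_like(B)
    h, w = B.shape
    for i in range(k, h - k):
        for j in range(k, w - k):
            y, x = best_match(A, A_prime, B, B_prime, i, j, k)
            B_prime[i, j] = A_prime[y, x]
    return B_prime

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.random((32, 32))
    A_prime = 1.0 - A          # toy "filter": invert intensities
    B = rng.random((32, 32))
    print(image_analogy(A, A_prime, B).shape)
```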

Citation (APA)

Hertzmann, A., Jacobs, C. E., Oliver, N., Curless, B., & Salesin, D. H. (2001). Image analogies. In Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques, SIGGRAPH 2001 (pp. 327–340). Association for Computing Machinery. https://doi.org/10.1145/383259.383295
