An Approximate Shading Model with Detail Decomposition for Object Relighting

Abstract

We present an object relighting system that allows an artist to select an object from an image and insert it into a target scene. Through simple interactions, the system adjusts the illumination on the inserted object so that it appears natural in the scene. To support image-based relighting, we build an object model from the image and propose a perceptually inspired approximate shading model for relighting. It decomposes the shading field into (a) a rough shape term that can be reshaded, (b) a parametric shading detail that encodes features missing from the first term, and (c) a geometric detail term that captures fine-scale material properties. With this decomposition, the shading model combines 3D rendering with image-based composition, allowing more flexible compositing than purely image-based methods. Quantitative evaluation and a set of user studies suggest that our method is a promising alternative to existing methods of object insertion.
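To make the three-term decomposition concrete, the sketch below is a minimal illustration only, not the authors' implementation. It assumes (hypothetically) a multiplicative recomposition in which the coarse-shape shading is re-rendered under the target illumination and the two detail layers recovered from the source image are reapplied; all names (relight_object, shading_detail, geometric_detail, etc.) are invented for this example.

```python
import numpy as np

def relight_object(rough_shape_shading_new, shading_detail, geometric_detail):
    """Illustrative recomposition of a shading field from three layers.

    Assumes (as a simplification) that the original shading S was factored as
        S = S_rough * D_param * D_geom,
    so relighting the inserted object amounts to re-rendering only the
    rough-shape term and reapplying the two detail layers.
    All inputs are H x W (or H x W x 3) arrays.
    """
    return rough_shape_shading_new * shading_detail * geometric_detail

# Hypothetical usage: detail layers extracted once from the source image,
# rough-shape shading re-rendered under the target scene's illumination.
h, w = 4, 4
s_rough_new = np.full((h, w), 0.8)            # re-rendered coarse shading
d_param = np.ones((h, w)) * 1.1               # parametric shading detail
d_geom = 1.0 + 0.05 * np.random.randn(h, w)   # fine-scale geometric detail
relit = relight_object(s_rough_new, d_param, d_geom)
print(relit.shape)
```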

Citation (APA)
Liao, Z., Karsch, K., Zhang, H., & Forsyth, D. (2019). An Approximate Shading Model with Detail Decomposition for Object Relighting. International Journal of Computer Vision, 127(1), 22–37. https://doi.org/10.1007/s11263-018-1090-6
