A synthesis-and-analysis approach to image based lighting

Abstract

Given a set of sample images of a scene under variable illumination, together with the corresponding light-source parameters (position and/or intensity), we propose an interpolation-based approach to model the variation of illumination in images. Once the interpolation model is constructed from the sample images, we can synthesize images under any lighting configuration defined in the parametric space. Moreover, given a query image of a scene containing a known reference object, our method estimates the lighting parameters of that image. The approach therefore supports both synthesis and analysis of images under different lighting conditions. The model is ultimately a compact representation of the set of all images whose lighting conditions lie within the parametric space. It can render different objects in unknown environments and can also infer the lighting of an unknown environment provided a known object is present in the scene. In this paper, we demonstrate robust image synthesis and analysis on two datasets: an object image dataset with varying lighting intensity and a face image dataset with varying light-source position. © 2012 Springer-Verlag.
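
The abstract only outlines the idea; the following is a minimal sketch of the synthesis and analysis steps, assuming a single scalar lighting parameter (e.g. light intensity) and simple piecewise-linear interpolation between sample images. All function and variable names are illustrative and do not reflect the authors' actual implementation.

import numpy as np

def build_model(samples):
    """samples: list of (parameter, image) pairs, images as float arrays of equal shape."""
    samples = sorted(samples, key=lambda s: s[0])
    params = np.array([p for p, _ in samples], dtype=np.float64)
    images = np.stack([img.astype(np.float64) for _, img in samples])
    return params, images

def synthesize(model, query_param):
    """Synthesis: render an image at an arbitrary parameter by interpolating samples."""
    params, images = model
    q = np.clip(query_param, params[0], params[-1])
    i = np.searchsorted(params, q)
    if i == 0:
        return images[0]
    t = (q - params[i - 1]) / (params[i] - params[i - 1])
    return (1 - t) * images[i - 1] + t * images[i]

def estimate_parameter(model, query_image, grid=200):
    """Analysis: find the parameter whose synthesized image best matches the query
    image, here by brute-force search over the parameter range."""
    params, _ = model
    candidates = np.linspace(params[0], params[-1], grid)
    errors = [np.sum((synthesize(model, c) - query_image) ** 2) for c in candidates]
    return candidates[int(np.argmin(errors))]

In this toy version, synthesis is direct interpolation in the parametric space, and analysis inverts the model by searching for the parameter that minimizes the reconstruction error against the query image of the known reference object.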

Citation (APA)

Galigekere, V., & Guerra-Filho, G. (2012). A synthesis-and-analysis approach to image based lighting. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7431 LNCS, pp. 292–304). https://doi.org/10.1007/978-3-642-33179-4_29
