Traditional shape from defocus has been based on modeling the defocusing process through a normalized point spread function (PSF). Here we show that, in the general case, the normalization factor depends on the depth map, which precludes shape estimation. If the camera is focused at far distances, however, this dependence can be neglected and an unnormalized PSF can be employed. We thus reformulate Pentland's shape-from-defocus approach using unnormalized Gaussians, and prove that, under certain assumptions, such a model allows the estimation of a dense depth map from a single input image. Moreover, by using unnormalized Gabor functions as a generalization of the unnormalized-Gaussian PSF, we are able to approximate any signal as the result of a series of local, frequency-dependent defocusing processes, to which the modified Pentland approach also applies. This approximation proves suitable for shading images, and has allowed us to obtain good shape-from-shading estimates essentially through a shape-from-defocus approach, without resorting to the reflectance-map concept. © 2008 Springer Berlin Heidelberg.
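The distinction between a normalized and an unnormalized Gaussian PSF can be illustrated with a minimal sketch (not the authors' implementation; function names, the 1-D setting, and the kernel size are assumptions for illustration only). A normalized kernel sums to one and preserves image brightness, while the unnormalized variant drops the depth-dependent normalization factor, which is the modeling choice the abstract describes:

```python
import numpy as np

def gaussian_kernel(sigma, size=9, normalized=True):
    # 1-D Gaussian PSF. The normalized variant sums to 1; the
    # unnormalized variant omits the normalization factor, which in
    # the defocus model would otherwise depend on scene depth.
    x = np.arange(size) - size // 2
    g = np.exp(-x**2 / (2.0 * sigma**2))
    return g / g.sum() if normalized else g

def defocus(signal, sigma, normalized=True):
    # Simulate defocus blur: convolve a 1-D signal with the chosen
    # PSF; in shape from defocus, sigma varies with depth.
    return np.convolve(signal, gaussian_kernel(sigma, normalized=normalized),
                       mode="same")
```

With the normalized kernel, blurring a constant signal leaves its level unchanged; with the unnormalized one, the output amplitude is scaled by the kernel sum, which is why normalization matters when the blur (and hence depth) varies across the image.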
CITATION STYLE
Torreão, J. R. A., & Fernandes, J. L. (2008). Shading through defocus. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5359 LNCS, pp. 501–510). https://doi.org/10.1007/978-3-540-89646-3_49