Unrestricted camera motion and the ability to operate over a range of lens parameters are often desirable when using an off-the-shelf camera. Variations in intrinsic and extrinsic parameters induce defocus and pixel motion, both of which relate to scene structure. We propose a depth estimation approach that couples the motion and defocus cues. We further extend our framework, in a natural way, to inpaint both depth and image using the motion cue. Unlike traditional inpainting, our approach also accounts for defocus blur, ensuring that the inpainted image is coherent with respect to defocus. Our estimation approach uses belief propagation, handles occlusions, and exploits the color image segmentation cue.
© 2010. The copyright of this document resides with its authors.
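The abstract states that the estimation is carried out with belief propagation over depth labels. The sketch below is not the authors' implementation; it is a generic min-sum loopy belief propagation on a 4-connected pixel grid, assuming a precomputed per-pixel data cost (e.g. one that already fuses motion and defocus evidence) and a truncated-linear smoothness term. All function names and parameters are illustrative.

```python
# Minimal sketch of min-sum loopy BP for discrete depth labelling on a grid.
# Assumes data_cost[h, w, l] gives the cost of assigning depth label l to
# pixel (h, w); occlusion handling and segmentation cues are omitted here.
import numpy as np

def truncated_linear(num_labels, weight=1.0, trunc=2.0):
    """Pairwise smoothness cost V(l1, l2) = weight * min(|l1 - l2|, trunc)."""
    labels = np.arange(num_labels)
    diff = np.abs(labels[:, None] - labels[None, :])
    return weight * np.minimum(diff, trunc)

def loopy_bp_depth(data_cost, n_iters=10, weight=1.0, trunc=2.0):
    """Return an (H, W) map of depth-label indices via min-sum loopy BP."""
    H, W, L = data_cost.shape
    V = truncated_linear(L, weight, trunc)              # (L, L) smoothness cost
    # msgs[d][h, w] is the incoming message that travelled in direction d.
    msgs = {d: np.zeros((H, W, L)) for d in ("up", "down", "left", "right")}
    OPP = {"up": "down", "down": "up", "left": "right", "right": "left"}

    for _ in range(n_iters):
        new = {}
        for d in msgs:
            # A pixel computing its outgoing message in direction d excludes
            # the incoming message sent by the intended receiver (opposite d).
            b = data_cost + sum(msgs[o] for o in msgs if o != OPP[d])
            # Minimise over the sender's label for every receiver label.
            m = np.min(b[..., :, None] + V[None, None, :, :], axis=2)
            m -= m.min(axis=2, keepdims=True)            # normalise for stability
            new[d] = m
        # Deliver: a message travelling 'down' arrives from the pixel above, etc.
        msgs["down"][1:, :, :] = new["down"][:-1, :, :]
        msgs["up"][:-1, :, :] = new["up"][1:, :, :]
        msgs["right"][:, 1:, :] = new["right"][:, :-1, :]
        msgs["left"][:, :-1, :] = new["left"][:, 1:, :]

    belief = data_cost + sum(msgs.values())
    return np.argmin(belief, axis=2)

if __name__ == "__main__":
    # Toy usage: random data cost for a 32x32 image with 8 depth labels.
    rng = np.random.default_rng(0)
    cost = rng.random((32, 32, 8))
    depth = loopy_bp_depth(cost, n_iters=5)
    print(depth.shape, depth.min(), depth.max())
```

In the paper's setting, the data cost would instead come from the motion- and defocus-consistency of each candidate depth, and additional terms would handle occlusions and respect color segmentation; those pieces are beyond this sketch.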
Bhavsar, A. V., & Rajagopalan, A. N. (2010). Depth estimation and inpainting with an unconstrained camera. In British Machine Vision Conference, BMVC 2010 - Proceedings. British Machine Vision Association, BMVA. https://doi.org/10.5244/C.24.84