Neural apparent BRDF fields for multiview photometric stereo

Abstract

We propose to tackle the multiview photometric stereo problem using an extension of Neural Radiance Fields (NeRFs), conditioned on light source direction. The geometric part of our neural representation predicts surface normal direction, allowing us to reason about local surface reflectance. The appearance part of our neural representation is decomposed into a neural bidirectional reflectance distribution function (BRDF), learnt as part of the fitting process, and a shadow prediction network (conditioned on light source direction), allowing us to model the apparent BRDF. This balance of learnt components with inductive biases based on physical image formation models allows us to extrapolate far from the light source and viewer directions observed during training. We demonstrate our approach on a multiview photometric stereo benchmark and show that competitive performance can be obtained with the neural density representation of a NeRF.
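To make the decomposition described in the abstract concrete, the following is a minimal illustrative sketch, in PyTorch, of how such a model could be wired together: a geometric network predicting density and surface normals, a neural BRDF conditioned on normal, view and light directions, and a shadow network conditioned on position and light direction. All module names (DensityNormalField, NeuralBRDF, ShadowNet), layer sizes, and the simple cosine-foreshortened shading combination are assumptions made for this sketch; it is not the authors' released code or exact architecture.

```python
# Illustrative sketch only; module names, sizes and the shading combination
# are assumptions, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DensityNormalField(nn.Module):
    """Geometric part: maps a 3D point to volume density and a unit surface normal."""
    def __init__(self, hidden=128):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.density_head = nn.Linear(hidden, 1)
        self.normal_head = nn.Linear(hidden, 3)

    def forward(self, x):
        h = self.mlp(x)
        sigma = torch.relu(self.density_head(h))          # non-negative density
        n = F.normalize(self.normal_head(h), dim=-1)      # unit-length normal
        return sigma, n


class NeuralBRDF(nn.Module):
    """Appearance part: reflectance from surface normal, view and light directions."""
    def __init__(self, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(9, hidden), nn.ReLU(),               # concatenated (n, view, light)
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),            # RGB reflectance in [0, 1]
        )

    def forward(self, n, view_dir, light_dir):
        return self.mlp(torch.cat([n, view_dir, light_dir], dim=-1))


class ShadowNet(nn.Module):
    """Scalar cast-shadow factor, conditioned on position and light direction."""
    def __init__(self, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(6, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),            # 0 = shadowed, 1 = unshadowed
        )

    def forward(self, x, light_dir):
        return self.mlp(torch.cat([x, light_dir], dim=-1))


def apparent_radiance(x, view_dir, light_dir, geo, brdf, shadow):
    """Combine the learnt BRDF with predicted shadowing and a cosine
    foreshortening term to get a per-point 'apparent BRDF' style contribution."""
    sigma, n = geo(x)
    rho = brdf(n, view_dir, light_dir)
    s = shadow(x, light_dir)
    cos_theta = torch.clamp((n * light_dir).sum(-1, keepdim=True), min=0.0)
    return sigma, s * rho * cos_theta


# Example usage on a batch of random sample points and unit directions.
geo, brdf, shadow = DensityNormalField(), NeuralBRDF(), ShadowNet()
x = torch.rand(1024, 3)
v = F.normalize(torch.randn(1024, 3), dim=-1)
l = F.normalize(torch.randn(1024, 3), dim=-1)
sigma, radiance = apparent_radiance(x, v, l, geo, brdf, shadow)
```

In the full method, per-point contributions like these would be composited along camera rays with NeRF-style volume rendering using the predicted densities; that rendering step is omitted here for brevity.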

Citation (APA)

Asthana, M., Smith, W., & Huber, P. (2022). Neural apparent BRDF fields for multiview photometric stereo. In Proceedings - CVMP 2022: 19th ACM SIGGRAPH European Conference on Visual Media Production. Association for Computing Machinery, Inc. https://doi.org/10.1145/3565516.3565517
