BakedSDF: Meshing Neural SDFs for Real-Time View Synthesis

Citations: 42
Mendeley readers: 113

Abstract

We present a method for reconstructing high-quality meshes of large unbounded real-world scenes suitable for photorealistic novel view synthesis. We first optimize a hybrid neural volume-surface scene representation designed to have well-behaved level sets that correspond to surfaces in the scene. We then bake this representation into a high-quality triangle mesh, which we equip with a simple and fast view-dependent appearance model based on spherical Gaussians. Finally, we optimize this baked representation to best reproduce the captured viewpoints, resulting in a model that can leverage accelerated polygon rasterization pipelines for real-time view synthesis on commodity hardware. Our approach outperforms previous scene representations for real-time rendering in terms of accuracy, speed, and power consumption, and produces high-quality meshes that enable applications such as appearance editing and physical simulation.
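The spherical-Gaussian appearance model mentioned in the abstract is compact enough to sketch. Below is a minimal, illustrative implementation (not the authors' code): each mesh vertex stores a diffuse RGB color plus a few spherical Gaussian lobes, and the rendered color is the diffuse term plus each lobe evaluated at the view direction. Function and parameter names, and the lobe count of three, are assumptions for this sketch.

```python
import numpy as np

def shade_vertex(view_dir, diffuse, sg_colors, sg_dirs, sg_sharpness):
    """Hypothetical per-vertex shading with a diffuse term plus
    spherical Gaussian lobes, in the spirit of BakedSDF's baked
    view-dependent appearance model.

    view_dir:     (3,) unit view direction
    diffuse:      (3,) base RGB color
    sg_colors:    (N, 3) per-lobe RGB amplitudes
    sg_dirs:      (N, 3) per-lobe unit mean directions
    sg_sharpness: (N,) per-lobe sharpness (lambda >= 0)
    """
    # Each lobe contributes c_i * exp(lambda_i * (d . mu_i - 1)),
    # which peaks at 1 when the view direction aligns with the lobe
    # and falls off faster for larger sharpness values.
    cos = sg_dirs @ view_dir                      # (N,)
    weights = np.exp(sg_sharpness * (cos - 1.0))  # (N,)
    return diffuse + weights @ sg_colors          # (3,)

# Usage example with three lobes (an assumed, illustrative count).
rng = np.random.default_rng(0)
d = np.array([0.0, 0.0, 1.0])
dirs = rng.normal(size=(3, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
rgb = shade_vertex(d, np.full(3, 0.5),
                   rng.uniform(size=(3, 3)), dirs, np.full(3, 8.0))
```

Because this evaluation is a handful of dot products and exponentials per vertex, it maps naturally onto a fragment or vertex shader in a standard rasterization pipeline, which is what enables the real-time rendering the abstract describes.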

Cite

APA:

Yariv, L., Hedman, P., Reiser, C., Verbin, D., Srinivasan, P. P., Szeliski, R., … Mildenhall, B. (2023). BakedSDF: Meshing Neural SDFs for Real-Time View Synthesis. In Proceedings - SIGGRAPH 2023 Conference Papers. Association for Computing Machinery, Inc. https://doi.org/10.1145/3588432.3591536
