PBIR-NIE: Glossy Object Capture under Non-Distant Lighting

University of California, Irvine · Adobe Research
[Teaser figure]

We propose PBIR-NIE, a physics-based inverse rendering pipeline that jointly optimizes an object’s shape, glossy surface reflectance, and a non-distant lighting representation. Our method faithfully recovers shiny, specular appearance, produces high-fidelity relighting results, and accurately captures geometric detail starting from a rough visual-hull initialization.

Abstract

Glossy objects present a significant challenge for 3D reconstruction from multi-view input images under natural lighting. In this paper, we introduce PBIR-NIE, an inverse rendering framework designed to holistically capture the geometry, material attributes, and surrounding illumination of such objects. We propose a novel parallax-aware non-distant environment map as a lightweight and efficient lighting representation, accurately modeling the near-field background of the scene, which is commonly encountered in real-world capture setups. This feature allows our framework to accommodate complex parallax effects beyond the capabilities of standard infinite-distance environment maps. Our method optimizes an underlying signed distance field (SDF) through physics-based differentiable rendering, seamlessly connecting surface gradients between a triangle mesh and the SDF via neural implicit evolution (NIE). To address the intricacies of highly glossy BRDFs in differentiable rendering, we integrate the antithetic sampling algorithm to mitigate variance in the Monte Carlo gradient estimator. Consequently, our framework exhibits robust capabilities in handling glossy object reconstruction, showcasing superior quality in geometry, relighting, and material estimation.
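To make the parallax-aware lighting idea concrete, below is a minimal PyTorch sketch of a non-distant environment lookup. It assumes the environment is textured onto a finite-radius proxy sphere and indexed by the direction of the ray's intersection with that sphere rather than by the ray direction alone; the names (`equirect_sample`, `envmap_lookup_parallax`, `proxy_radius`) are illustrative, and the actual Envmap++ parameterization in the paper may differ.

    import math
    import torch

    def equirect_sample(envmap, d):
        # envmap: (H, W, 3) HDR texture; d: (..., 3) unit directions.
        # Nearest-neighbor equirectangular lookup, kept simple for illustration.
        H, W, _ = envmap.shape
        theta = torch.acos(d[..., 2].clamp(-1.0, 1.0))          # polar angle in [0, pi]
        phi = torch.atan2(d[..., 1], d[..., 0])                  # azimuth in (-pi, pi]
        u = ((phi / (2.0 * math.pi)) % 1.0) * (W - 1)
        v = (theta / math.pi) * (H - 1)
        return envmap[v.long(), u.long()]

    def envmap_lookup_infinite(envmap, ray_dir):
        # Standard environment map: radiance depends only on the ray direction,
        # so every shading point sees the same background (no parallax).
        return equirect_sample(envmap, ray_dir)

    def envmap_lookup_parallax(envmap, ray_org, ray_dir, proxy_radius=2.0):
        # Hypothetical non-distant lookup: intersect the (unit-length) ray with a
        # proxy sphere of finite radius centered at the scene origin, then index
        # the map by the direction of the hit point. Shading points at different
        # locations see shifted backgrounds, producing parallax.
        b = (ray_org * ray_dir).sum(-1)
        c = (ray_org * ray_org).sum(-1) - proxy_radius ** 2
        t = -b + torch.sqrt((b * b - c).clamp(min=0.0))          # far intersection
        hit = ray_org + t.unsqueeze(-1) * ray_dir
        return equirect_sample(envmap, torch.nn.functional.normalize(hit, dim=-1))

Because the lookup is a pure tensor computation, gradients with respect to the environment texels flow through it during optimization; replacing the nearest-neighbor fetch with a bilinear one (e.g., via `torch.nn.functional.grid_sample`) would additionally let gradients reach the proxy geometry.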

Our Pipeline

[Pipeline figure]

Our pipeline takes a set of multi-view images of a glossy object and an initial shape as input. It then reconstructs the scene’s geometry, material properties, and lighting using a physics-based inverse rendering (PBIR) approach. Each iteration of the refinement consists of three steps:
  1. Forward Pass: Render images from the training viewpoints using physics-based differentiable rendering. The shape is represented by an explicit mesh extracted from the neural implicit surface with a (non-differentiable) Marching Cubes step, surface properties by material networks, and lighting by Envmap++, which replaces the standard infinite-distance environment map to handle non-distant background illumination.
  2. Backward Pass: Compare the rendered images to the ground truth and compute gradients with respect to the scene parameters. We use neural implicit evolution (NIE) [Mehta et al. 2022] to propagate gradients from the extracted mesh back to the neural implicit surface, bypassing the non-differentiable extraction step (a sketch of this gradient transfer follows this list).
  3. Update: Adjust the scene parameters (geometry, material, lighting) using the computed gradients to minimize the difference between the rendered and ground-truth images.
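Below is a minimal PyTorch sketch of the NIE-style gradient transfer used in the backward pass. It assumes the implicit surface is an SDF network `sdf_net` and that `grad_verts` holds the image-loss gradients with respect to the extracted mesh vertices, as produced by a differentiable renderer; the function name and the surrogate-loss construction are ours for illustration, not the paper's exact implementation.

    import torch

    def nie_gradient_transfer(sdf_net, verts, grad_verts):
        # verts: (V, 3) vertices extracted by Marching Cubes (on the SDF zero level set).
        # grad_verts: (V, 3) dL/dx_v from the differentiable renderer.
        # Level-set relation: a parameter change dtheta moves a surface point along
        # the normal by dx = -(df/dtheta) * n / |grad f|, hence
        #   dL/dtheta = sum_v (dL/dx_v . n_v) * (-1 / |grad f(x_v)|) * df(x_v)/dtheta.
        verts = verts.detach().requires_grad_(True)
        f = sdf_net(verts).squeeze(-1)                                    # SDF values (~0 on the surface)
        (grad_f,) = torch.autograd.grad(f.sum(), verts, create_graph=True)
        n = torch.nn.functional.normalize(grad_f, dim=-1)                 # normals grad f / |grad f|
        inv_norm = 1.0 / grad_f.norm(dim=-1).clamp(min=1e-8)
        w = (grad_verts * n).sum(-1) * inv_norm                           # per-vertex weight
        # Scalar surrogate whose gradient w.r.t. the SDF parameters equals the sum above.
        surrogate = -(w.detach() * f).sum()
        surrogate.backward()                                              # accumulates into sdf_net.parameters()

In a full iteration, the renderer's vertex gradients come from differentiating the image loss, the material networks and Envmap++ parameters receive gradients directly through standard autograd, and all parameters are then updated with a gradient-based optimizer. As noted in the abstract, antithetic sampling is applied inside the renderer's Monte Carlo gradient estimator to tame the variance caused by highly glossy BRDFs.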

Comparisons with NeRO

We evaluate the quality of material and lighting reconstruction on NeRO’s glossy synthetic dataset. In this experiment, we use the same geometry as NeRO and compare NeRO’s Stage II results with ours. PBIR-NIE reconstructs the appearance of glossy objects with higher fidelity, capturing environment details and sharp highlights more faithfully.

In the following, we compare our results with NeRO in terms of novel view synthesis, relighting, and environment map reconstruction (our results on the left, NeRO's on the right).

[Novel view synthesis comparison 1]
[Novel view synthesis comparison 2]
[Relighting comparison 1]
[Relighting comparison 2]
[Environment map comparison]

BibTeX

@misc{cai2024pbirnieglossyobjectcapture,
  title={PBIR-NIE: Glossy Object Capture under Non-Distant Lighting},
  author={Guangyan Cai and Fujun Luan and Miloš Hašan and Kai Zhang and Sai Bi and Zexiang Xu and Iliyan Georgiev and Shuang Zhao},
  year={2024},
  eprint={2408.06878},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2408.06878},
}