Glossy objects present a significant challenge for 3D reconstruction from multi-view input images under natural lighting. In this paper, we introduce PBIR-NIE, an inverse rendering framework designed to holistically capture the geometry, material attributes, and surrounding illumination of such objects. We propose a novel parallax-aware non-distant environment map as a lightweight and efficient lighting representation, accurately modeling the near-field background of the scene, which is commonly encountered in real-world capture setups. This feature allows our framework to accommodate complex parallax effects beyond the capabilities of standard infinite-distance environment maps. Our method optimizes an underlying signed distance field (SDF) through physics-based differentiable rendering, seamlessly connecting surface gradients between a triangle mesh and the SDF via neural implicit evolution (NIE). To address the intricacies of highly glossy BRDFs in differentiable rendering, we integrate the antithetic sampling algorithm to mitigate variance in the Monte Carlo gradient estimator. Consequently, our framework exhibits robust capabilities in handling glossy object reconstruction, showcasing superior quality in geometry, relighting, and material estimation.
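To make the non-distant lighting idea concrete, below is a minimal sketch, not the authors' implementation: it assumes the environment is stored as an equirectangular emissive texture on a finite-radius proxy sphere centered at the scene, and the function names (sphere_exit_point, envmap_lookup, radiance_non_distant) and the proxy-sphere choice are illustrative assumptions rather than details taken from the paper. The essential difference from a standard infinite-distance environment map is that the lookup depends on the shading point as well as the outgoing ray direction, which is what produces parallax.

# Minimal sketch (not the authors' code) of a parallax-aware environment lookup.
# Assumption: the non-distant environment is an emissive equirectangular texture
# on a finite-radius proxy sphere centered at the scene origin. A standard
# infinite-distance environment map indexes the texture by direction alone;
# here the result also depends on the shading point, producing parallax.

import numpy as np

def sphere_exit_point(origin, direction, radius):
    """Intersect a ray (origin, unit direction) with a sphere of the given radius
    centered at the world origin and return the far intersection point."""
    b = np.dot(origin, direction)
    c = np.dot(origin, origin) - radius * radius
    t = -b + np.sqrt(np.maximum(b * b - c, 0.0))  # far hit; origin assumed inside
    return origin + t * direction

def envmap_lookup(texture, direction):
    """Equirectangular lookup by unit direction (nearest neighbor, y is up)."""
    h, w, _ = texture.shape
    theta = np.arccos(np.clip(direction[1], -1.0, 1.0))  # polar angle
    phi = np.arctan2(direction[2], direction[0])          # azimuth
    u = (phi / (2.0 * np.pi) + 0.5) % 1.0
    v = theta / np.pi
    return texture[min(int(v * h), h - 1), min(int(u * w), w - 1)]

def radiance_distant(texture, x, wi):
    """Classic infinite-distance environment map: ignores the shading point x."""
    return envmap_lookup(texture, wi)

def radiance_non_distant(texture, x, wi, radius=2.0):
    """Parallax-aware lookup: trace from the shading point x along wi to the proxy
    sphere, then index the texture by the hit point as seen from the center."""
    p = sphere_exit_point(x, wi, radius)
    return envmap_lookup(texture, p / np.linalg.norm(p))

# Two shading points looking along the same direction now receive different
# radiance, which a direction-only environment map cannot express.
tex = np.random.rand(64, 128, 3).astype(np.float32)
wi = np.array([0.0, 0.0, 1.0])
print(radiance_non_distant(tex, np.array([0.3, 0.0, 0.0]), wi))
print(radiance_non_distant(tex, np.array([-0.3, 0.0, 0.0]), wi))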
We evaluate the quality of material and lighting reconstruction on NeRO's glossy synthetic dataset. In this experiment, we use the same geometry as NeRO and compare our results against NeRO's stage-II results. PBIR-NIE achieves superior reconstruction quality on the appearance of glossy objects, capturing environment details and sharp highlights more faithfully.
In the following, we compare our results with NeRO in terms of novel view synthesis, relighting, and environment map reconstruction. Each image has a slider for you to switch between our results (left) and NeRO's results (right). You can change the scene using the buttons below.
@misc{cai2024pbirnieglossyobjectcapture,
  title={PBIR-NIE: Glossy Object Capture under Non-Distant Lighting},
  author={Guangyan Cai and Fujun Luan and Miloš Hašan and Kai Zhang and Sai Bi and Zexiang Xu and Iliyan Georgiev and Shuang Zhao},
  year={2024},
  eprint={2408.06878},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2408.06878},
}