Abstract

Though neural radiance fields (NeRF) have demonstrated impressive view synthesis results on objects and small bounded regions of space, they struggle on “unbounded” scenes, where the camera may point in any direction and content may exist at any distance. In this setting, existing NeRF-like models often produce blurry or low-resolution renderings (due to the unbalanced detail and scale of nearby and distant objects), are slow to train, and may exhibit artifacts due to the inherent ambiguity of the task of reconstructing a large scene from a small set of images. We present an extension of mip-NeRF (a NeRF variant that addresses sampling and aliasing) that uses a non-linear scene parameterization, online distillation, and a novel distortion-based regularizer to overcome the challenges presented by unbounded scenes. Our model, which we dub “mip-NeRF 360” as we target scenes in which the camera rotates 360 degrees around a point, reduces mean-squared error by 57% compared to mip-NeRF, and is able to produce realistic synthesized views and detailed depth maps for highly intricate, unbounded real-world scenes.
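The “non-linear scene parameterization” mentioned in the abstract refers to warping unbounded space into a bounded domain before the scene is sampled. As a rough illustration only (a minimal sketch, not the paper's implementation; the function name, radius constants, and epsilon below are assumptions), a radial contraction of this kind can be written as:

    import numpy as np

    def contract(x: np.ndarray, eps: float = 1e-9) -> np.ndarray:
        # Hypothetical sketch: map 3D points of shape [..., 3] from all of R^3
        # into a ball of radius 2. Points inside the unit ball are left
        # unchanged; points outside are pulled in smoothly, so arbitrarily
        # distant content lands in a bounded volume the model can cover.
        norm = np.maximum(np.linalg.norm(x, axis=-1, keepdims=True), eps)
        contracted = (2.0 - 1.0 / norm) * (x / norm)
        return np.where(norm <= 1.0, x, contracted)

    # A point 100 units from the origin maps to roughly radius 1.99,
    # just inside the bounding ball.
    print(contract(np.array([[100.0, 0.0, 0.0]])))

In the full method the warp is reportedly applied to the Gaussians that represent conical ray segments rather than to individual points, so this sketch captures only the point-wise idea.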

Keywords

Computer science, Computer vision, Aliasing, Artificial intelligence, Point (geometry), Set (abstract data type), Radiance, Bounded function, Distortion (music), Ambiguity, Mathematics, Geometry, Physics, Mathematical analysis, Filter (signal processing), Optics

Related Publications

NeRF

We present a method that achieves state-of-the-art results for synthesizing novel views of complex scenes by optimizing an underlying continuous volumetric scene function using ...

2021, Communications of the ACM, 4497 citations

Publication Info

Year: 2022
Type: Article
Citations: 1337
Access: Closed

Citation Metrics

1337 citations (source: OpenAlex)

Cite This

Jonathan T. Barron, Ben Mildenhall, Dor Verbin et al. (2022). Mip-NeRF 360: Unbounded Anti-Aliased Neural Radiance Fields. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). https://doi.org/10.1109/cvpr52688.2022.00539

Identifiers

DOI: 10.1109/cvpr52688.2022.00539