Splat and Replace: 3D Reconstruction with Repetitive Elements

Bibliographic Details
Title: Splat and Replace: 3D Reconstruction with Repetitive Elements
Authors: Nicolas Violante, Andréas Meuleman, Alban Gauthier, Frédo Durand, Thibault Groueix, George Drettakis
Other Authors: Reves, Team
Source: Proceedings of the Special Interest Group on Computer Graphics and Interactive Techniques Conference Papers, pp. 1–12
Publication Status: Preprint
Publisher Information: ACM, 2025.
Publication Year: 2025
Subjects: FOS: Computer and information sciences, Segmentation, [INFO.INFO-CV] Computer Science [cs]/Computer Vision and Pattern Recognition [cs.CV], Computer Science - Graphics, Repetitions, Computer Vision and Pattern Recognition (cs.CV), [INFO.INFO-GR] Computer Science [cs]/Graphics [cs.GR], Computer Science - Computer Vision and Pattern Recognition, Matching, 3D Gaussian Splatting, [INFO.INFO-LG] Computer Science [cs]/Machine Learning [cs.LG], [INFO] Computer Science [cs], Graphics (cs.GR)
Description: We leverage repetitive elements in 3D scenes to improve novel view synthesis. Neural Radiance Fields (NeRF) and 3D Gaussian Splatting (3DGS) have greatly improved novel view synthesis, but renderings of unseen and occluded parts remain low-quality if the training views are not exhaustive enough. Our key observation is that our environment is often full of repetitive elements. We propose to leverage those repetitions to improve the reconstruction of parts of the scene that are low-quality due to poor coverage and occlusions. We propose a method that segments each repeated instance in a 3DGS reconstruction, registers them together, and allows information to be shared among instances. Our method improves the geometry while also accounting for appearance variations across instances. We demonstrate our method on a variety of synthetic and real scenes with typical repetitive elements, leading to a substantial improvement in the quality of novel view synthesis.
SIGGRAPH Conference Papers 2025. Project site: https://repo-sam.inria.fr/nerphys/splat-and-replace/
Publication Type: Article; Conference object
File Description: application/pdf
DOI: 10.1145/3721238.3730727
DOI: 10.48550/arxiv.2506.06462
Access URL: http://arxiv.org/abs/2506.06462
Rights: CC BY
Document Code: edsair.doi.dedup.....0309898e4d95c136c491a4ccce181927
Database: OpenAIRE