Robust normal estimation in unstructured 3D point clouds by selective normal space exploration

Full description

Bibliographic Details
Published in: The Visual Computer, Vol. 34, No. 6-8, pp. 961-971
Main authors: Mura, Claudio; Wyss, Gregory; Pajarola, Renato
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.06.2018; Springer Nature B.V.
ISSN: 0178-2789, 1432-2315
Online access: Full text
Description
Abstract: We present a fast and practical approach for estimating robust normal vectors in unorganized point clouds. Our technique is robust to noise and outliers and preserves sharp features in the input model while being significantly faster than the current state-of-the-art alternatives. The key idea is a novel strategy for exploring the normal space: first, an initial candidate normal vector, optimal under a robust least median norm, is selected from a discrete subregion of this space, chosen conservatively to include the correct normal; then, the final robust normal is computed using a simple, robust procedure that iteratively refines the initially selected candidate. This strategy reduces the computation time significantly with respect to other methods based on sampling consensus, yet produces very reliable normals even in the presence of noise and outliers as well as along sharp features. The validity of our approach is confirmed by extensive testing on both synthetic and real-world data and by a comparison against the most relevant state-of-the-art approaches.
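
The abstract describes the method only at a high level. The following Python sketch is a loose illustration of the two-stage idea (a discrete candidate set scored with a least-median-of-squares criterion, followed by iterative refinement), not the authors' algorithm: the PCA seeding, the cone-shaped candidate region, and all parameter values are assumptions made for this example.

```python
# Illustrative sketch (not the authors' implementation) of the two-stage strategy
# described in the abstract: candidate normals are drawn from a subregion of the
# normal space around a PCA seed, scored with a least-median-of-squares criterion,
# and the best candidate is then refined iteratively.
import numpy as np


def pca_normal(points):
    """Standard PCA normal: eigenvector of the smallest covariance eigenvalue."""
    centered = points - points.mean(axis=0)
    _, eigvecs = np.linalg.eigh(centered.T @ centered)  # eigenvalues ascending
    return eigvecs[:, 0]


def candidate_normals(seed, n_candidates=64, cone_angle=np.pi / 3, rng=None):
    """Discrete candidate directions on a spherical cap (cone) around the seed normal."""
    rng = np.random.default_rng(0) if rng is None else rng
    helper = np.array([1.0, 0.0, 0.0]) if abs(seed[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(seed, helper)
    u /= np.linalg.norm(u)
    v = np.cross(seed, u)
    cos_t = rng.uniform(np.cos(cone_angle), 1.0, n_candidates)
    sin_t = np.sqrt(1.0 - cos_t ** 2)
    phi = rng.uniform(0.0, 2.0 * np.pi, n_candidates)
    dirs = (sin_t * np.cos(phi))[:, None] * u \
         + (sin_t * np.sin(phi))[:, None] * v \
         + cos_t[:, None] * seed
    return np.vstack([seed, dirs])  # always keep the seed as one candidate


def robust_normal(point, neighbors, n_candidates=64, n_refine=3):
    """Two-stage robust normal estimate for `point` given its k nearest `neighbors`."""
    # Stage 1: least-median-of-squares selection over the discrete candidate set.
    seed = pca_normal(neighbors)
    candidates = candidate_normals(seed, n_candidates)
    residuals = (neighbors - point) @ candidates.T          # point-to-plane distances
    best = candidates[np.argmin(np.median(residuals ** 2, axis=0))]
    # Stage 2: iterative refinement by refitting a plane to the inliers of the
    # current estimate; the inlier band scales with the median residual.
    for _ in range(n_refine):
        r = np.abs((neighbors - point) @ best)
        inliers = neighbors[r <= 2.5 * max(np.median(r), 1e-9)]
        if len(inliers) < 3:
            break
        refined = pca_normal(inliers)
        best = refined if refined @ best >= 0.0 else -refined  # keep orientation
    return best


# Toy check: a query point on a sharp edge where two noisy planes (z = 0 and x = 0)
# meet; a robust estimate typically aligns with one plane instead of averaging.
rng = np.random.default_rng(1)
plane_a = np.c_[rng.uniform(0.05, 1, 40), rng.uniform(-1, 1, 40), 0.01 * rng.normal(size=40)]
plane_b = np.c_[0.01 * rng.normal(size=40), rng.uniform(-1, 1, 40), rng.uniform(0.05, 1, 40)]
print(robust_normal(np.zeros(3), np.vstack([plane_a, plane_b])))
```

The least-median-of-squares score tolerates up to roughly half of the neighbors being outlying (for example, points lying across a sharp edge), which is what lets such an estimate lock onto a single surface patch rather than smoothing over the crease.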
DOI: 10.1007/s00371-018-1542-6