Robust normal estimation in unstructured 3D point clouds by selective normal space exploration

Published in: The Visual Computer, Vol. 34, No. 6-8, pp. 961-971
Main Authors: Mura, Claudio; Wyss, Gregory; Pajarola, Renato
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.06.2018
ISSN: 0178-2789, 1432-2315
Description
Summary: We present a fast and practical approach for estimating robust normal vectors in unorganized point clouds. Our technique is robust to noise and outliers and preserves sharp features in the input model while being significantly faster than current state-of-the-art alternatives. The key idea is a novel strategy for exploring the normal space: first, an initial candidate normal vector, optimal under a robust least-median norm, is selected from a discrete subregion of this space, chosen conservatively so as to include the correct normal; then, the final robust normal is computed by a simple, robust procedure that iteratively refines the initially selected candidate. This strategy reduces the computation time significantly with respect to other methods based on sampling consensus, yet produces very reliable normals even in the presence of noise and outliers as well as along sharp features. The validity of our approach is confirmed by extensive testing on both synthetic and real-world data and by a comparison against the most relevant state-of-the-art approaches.
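The two-stage strategy described in the summary can be illustrated in code. Below is a minimal, hypothetical Python sketch, not the authors' implementation: candidate normals are drawn from a discretized portion of the normal space and scored by a least-median-of-squares criterion, and the best candidate is then refined iteratively on the inlier neighbors. The hemisphere sampling, the function names, and all parameter values are illustrative assumptions.

```python
# Illustrative sketch of the two-stage strategy from the abstract:
# (1) select an initial candidate normal from a discretized region of the
#     normal space under a least-median-of-squares criterion;
# (2) iteratively refine it by plane fitting on the inlier neighbors.
# Sampling scheme, thresholds, and parameters are assumptions for this sketch.

import numpy as np

def fibonacci_hemisphere(n):
    """Quasi-uniform candidate directions on the unit hemisphere (z >= 0)."""
    i = np.arange(n)
    phi = np.pi * (3.0 - np.sqrt(5.0)) * i        # golden-angle increment
    z = i / max(n - 1, 1)                          # z in [0, 1]
    r = np.sqrt(np.maximum(0.0, 1.0 - z * z))
    return np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)

def estimate_normal(p, neighbors, n_candidates=128, n_iters=3):
    """Robust normal at point p from its k nearest neighbors (k x 3 array)."""
    q = neighbors - p                              # center on the query point
    candidates = fibonacci_hemisphere(n_candidates)

    # Stage 1: score each candidate by the median squared point-to-plane
    # distance of the neighbors; the median makes the score outlier-robust.
    dists2 = (q @ candidates.T) ** 2               # shape (k, n_candidates)
    scores = np.median(dists2, axis=0)
    normal = candidates[np.argmin(scores)]

    # Stage 2: refine by refitting a plane (direction of smallest variance)
    # to the neighbors that are inliers w.r.t. the current normal.
    for _ in range(n_iters):
        d2 = (q @ normal) ** 2
        tau = max(np.median(d2), 1e-12)            # adaptive inlier threshold
        inliers = q[d2 <= tau]
        if len(inliers) < 3:
            break
        _, _, vt = np.linalg.svd(inliers - inliers.mean(axis=0))
        refined = vt[-1]                           # smallest singular vector
        normal = refined if refined @ normal >= 0 else -refined
    return normal
```

In a full pipeline, estimate_normal would be called per point with neighbors obtained from a spatial index such as a k-d tree; the median-based score is what keeps the initial selection tolerant to outliers and to neighborhoods that straddle sharp features.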
DOI: 10.1007/s00371-018-1542-6