Watertight Scenes from Urban LiDAR and Planar Surfaces

Bibliographic Details
Published in: Computer Graphics Forum, Vol. 32, No. 5, pp. 217–228
Main Authors: van Kreveld, M., van Lankveld, T., Veltkamp, R. C.
Format: Journal Article
Language: English
Published: Oxford, UK: Blackwell Publishing Ltd (Wiley), 01.08.2013
ISSN: 0167-7055, 1467-8659
Description
Summary: The demand for large geometric models is increasing, especially of urban environments. This has resulted in the production of massive point cloud data from images or LiDAR. Visualization and further processing generally require a detailed, yet concise representation of the scene's surfaces. Related work generally either approximates the data, with the risk of over-smoothing, or interpolates the data with excessive detail. Many surfaces in urban scenes can be modeled more concisely by planar approximations. We present a method that combines these polygons into a watertight model. The polygon-based shape is closed with free-form meshes based on visibility information. To achieve this, we divide 3-space into inside and outside volumes by combining a constrained Delaunay tetrahedralization with a graph-cut. We compare our method with related work on several large urban LiDAR data sets. We construct similar shapes with a third fewer triangles to model the scenes. Additionally, our results are more visually pleasing and closer to a human modeler's description of urban scenes using simple boxes.
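The labeling step mentioned in the summary (splitting the tetrahedralization into inside and outside volumes with a graph-cut) can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: the tetrahedron ids, adjacency structure, and the visibility-derived weights (outside_weight, inside_weight, facet_weight) are assumed inputs from earlier stages, the paper's specific weighting scheme is not reproduced, and the min-cut is delegated to networkx as a stand-in max-flow solver.

```python
import networkx as nx

def label_inside_outside(tets, adjacency, outside_weight, inside_weight, facet_weight):
    """Label tetrahedra 'inside' or 'outside' via an s-t minimum cut (illustrative sketch).

    Assumed inputs:
      tets           -- iterable of tetrahedron ids from the tetrahedralization
      adjacency      -- dict: tet id -> list of (neighbour id, shared facet id)
      outside_weight -- dict: tet id -> visibility evidence that the tet is empty space
      inside_weight  -- dict: tet id -> evidence that the tet lies inside the model
      facet_weight   -- dict: facet id -> cost of placing the output surface on that facet
    """
    G = nx.DiGraph()
    src, sink = "OUTSIDE", "INSIDE"
    for t in tets:
        # Terminal edges encode how strongly the data votes for each label.
        G.add_edge(src, t, capacity=outside_weight.get(t, 0.0))
        G.add_edge(t, sink, capacity=inside_weight.get(t, 0.0))
    for t, nbrs in adjacency.items():
        for n, facet in nbrs:
            # Cutting a neighbour edge places a surface triangle on the shared facet,
            # so its capacity penalises implausible or unnecessarily large surfaces.
            G.add_edge(t, n, capacity=facet_weight.get(facet, 1.0))
    _, (reachable, _) = nx.minimum_cut(G, src, sink)
    return {t: ("outside" if t in reachable else "inside") for t in tets}
```

With such a labeling, the watertight surface would consist of the facets shared by a tetrahedron labeled "inside" and one labeled "outside"; in the paper these facets are combined with the planar polygon approximations to close the model.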
DOI: 10.1111/cgf.12188