Building Floorplan Reconstruction Based on Integer Linear Programming

Bibliographic Details
Published in: Remote Sensing (Basel, Switzerland), Vol. 14, No. 18, p. 4675
Main Authors: Wang, Qiting, Zhu, Zunjie, Chen, Ruolin, Xia, Wei, Yan, Chenggang
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 01.09.2022
ISSN: 2072-4292
Description
Summary: Building floorplan reconstruction creates a two-dimensional floorplan from a 3D model, a task widely used in interior design and decoration. In practice, indoor environments have complex structures with heavy clutter and occlusion, which makes it difficult to reconstruct a complete and accurate floorplan. A suitable dataset is key to driving an effective algorithm, yet existing floorplan reconstruction datasets are synthetic and small; without a reliable accumulation of real data, methods lose robustness on real-scene reconstruction. In this paper, we first annotate a large-scale realistic benchmark containing RGBD image sequences and 3D models of 80 indoor scenes covering more than 10,000 square meters. We also introduce a floorplan reconstruction framework with mesh-based point cloud normalization. A loose Manhattan constraint is enforced during optimization, and the optimal floorplan is reconstructed via constraint integer programming. Experimental results on public datasets and our own dataset demonstrate that the proposed method outperforms FloorNet and Floor-SP.
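The abstract only sketches the optimization step, so the following is a minimal, illustrative sketch of how a floorplan can be recovered by integer programming: binary variables select candidate wall segments, the objective rewards point-cloud evidence and penalizes model complexity, and an even-degree constraint forces the selected segments to close into room boundaries. The candidate list, scores, penalty weight, and formulation are assumptions for illustration, not the paper's actual model (which also incorporates the loose Manhattan constraint); the sketch uses the PuLP library with its bundled CBC solver.

import pulp

# Hypothetical candidate wall segments: (junction_a, junction_b, evidence score
# from the normalized point cloud). Names and values are illustrative only.
candidates = [(0, 1, 0.9), (1, 2, 0.8), (2, 3, 0.85), (3, 0, 0.7), (1, 3, 0.3)]
num_junctions = 4
complexity_penalty = 0.2  # assumed weight discouraging spurious walls

prob = pulp.LpProblem("floorplan_segment_selection", pulp.LpMaximize)
x = [pulp.LpVariable(f"seg_{k}", cat="Binary") for k in range(len(candidates))]

# Objective: total evidence of the selected segments minus a complexity penalty.
prob += pulp.lpSum((score - complexity_penalty) * x[k]
                   for k, (_, _, score) in enumerate(candidates))

# Closed-loop constraint: every junction is touched by exactly 0 or 2 selected
# segments, so the chosen walls form closed boundaries.
for j in range(num_junctions):
    used = pulp.LpVariable(f"junction_{j}", cat="Binary")
    degree = pulp.lpSum(x[k] for k, (a, b, _) in enumerate(candidates) if j in (a, b))
    prob += degree == 2 * used

prob.solve(pulp.PULP_CBC_CMD(msg=False))
selected = [candidates[k][:2] for k in range(len(candidates))
            if pulp.value(x[k]) > 0.5]
print("selected wall segments:", selected)

On this toy input the solver keeps the four boundary segments forming the closed loop 0-1-2-3-0 and rejects the low-evidence diagonal (1, 3), since adding it would give junctions 1 and 3 an odd degree.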
DOI: 10.3390/rs14184675