Filling gaps of cartographic polylines by using an encoder-decoder model

Bibliographic Details
Published in: International Journal of Geographical Information Science (IJGIS), Vol. 36, No. 11, pp. 2296–2321
Main Authors: Yu, Wenhao; Chen, Yujie
Format: Journal Article
Language: English
Published: Abingdon: Taylor & Francis, 02.11.2022
ISSN: 1365-8816, 1362-3087, 1365-8824
Description
Summary: Geospatial studies must address spatial data quality, especially in data-driven research. An essential concern is how to fill spatial data gaps (missing data), such as for cartographic polylines. Recent advances in deep learning have shown promise in filling holes in images with semantically plausible and context-aware details. In this paper, we propose an effective framework for vector-structured polyline completion using a generative model. The model is trained to generate the contents of missing polylines of different sizes and shapes, conditioned on their contexts. Specifically, the generator computes the content of the entire polyline sample globally and produces a plausible prediction for local gaps. The proposed model was applied to contour data for validation. The experiments generated gaps of random sizes at random locations along the polyline samples. Qualitative and quantitative evaluations show that our model can fill missing points with high perceptual quality and adaptively handle a range of gaps. In addition to the simulation experiment, two case studies on map vectorization and trajectory filling illustrate the application prospects of our model.
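The abstract describes the approach only at a high level. As a purely illustrative sketch of the general technique, and not the authors' model (the paper specifies the actual architecture), the following minimal PyTorch example fills a masked gap in a 2-D polyline with a bidirectional recurrent encoder and a pointwise decoder; all names, sizes, and the synthetic contour data are hypothetical assumptions.

import math
import torch
import torch.nn as nn

class PolylineGapFiller(nn.Module):
    # Hypothetical stand-in model: encodes the whole polyline (global
    # context) and predicts coordinates for the masked (missing) points.
    def __init__(self, hidden=64):
        super().__init__()
        # Input per vertex: (x, y, mask), where mask=1 marks a missing
        # point whose coordinates have been zeroed out.
        self.encoder = nn.GRU(input_size=3, hidden_size=hidden,
                              batch_first=True, bidirectional=True)
        self.decoder = nn.Linear(2 * hidden, 2)  # per-step (x, y) output

    def forward(self, points, mask):
        # points: (B, T, 2) with gap vertices zeroed; mask: (B, T, 1)
        feats, _ = self.encoder(torch.cat([points, mask], dim=-1))
        pred = self.decoder(feats)  # (B, T, 2)
        # Keep observed vertices unchanged; predict only inside the gap.
        return torch.where(mask.bool(), pred, points)

# Toy training loop on a synthetic closed contour (a unit circle).
torch.manual_seed(0)
model = PolylineGapFiller()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
t = torch.linspace(0.0, 2.0 * math.pi, 64)
truth = torch.stack([torch.cos(t), torch.sin(t)], dim=-1).unsqueeze(0)  # (1, 64, 2)

for step in range(200):
    # Simulate a gap of random size at a random location, echoing the
    # experimental setup sketched in the abstract (bounds are arbitrary).
    start = int(torch.randint(5, 40, (1,)))
    width = int(torch.randint(4, 16, (1,)))
    mask = torch.zeros(1, 64, 1)
    mask[:, start:start + width] = 1.0
    corrupted = truth * (1.0 - mask)

    filled = model(corrupted, mask)
    loss = ((filled - truth) ** 2 * mask).sum() / mask.sum()  # gap-only loss
    opt.zero_grad()
    loss.backward()
    opt.step()

The mask channel tells the encoder where the gap lies, so predictions remain conditioned on the surrounding context, and computing the loss only over masked vertices mirrors the random-gap simulation the abstract describes; the paper's actual generative formulation may differ.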
DOI: 10.1080/13658816.2022.2055036