Filling gaps of cartographic polylines by using an encoder-decoder model

Bibliographic Details
Published in: International Journal of Geographical Information Science (IJGIS), Volume 36, Issue 11, pp. 2296-2321
Main authors: Yu, Wenhao; Chen, Yujie
Format: Journal Article
Language: English
Publication details: Abingdon: Taylor & Francis, 02.11.2022
ISSN: 1365-8816, 1362-3087, 1365-8824
Description
Summary: Geospatial studies must address spatial data quality, especially in data-driven research. An essential concern is how to fill spatial data gaps (missing data), such as for cartographic polylines. Recent advances in deep learning have shown promise in filling holes in images with semantically plausible and context-aware details. In this paper, we propose an effective framework for vector-structured polyline completion using a generative model. The model is trained to generate the contents of missing polyline segments of different sizes and shapes, conditioned on their context. Specifically, the generator computes the content of the entire polyline sample globally and produces a plausible prediction for local gaps. The proposed model was applied to contour data for validation. The experiments generated gaps of random sizes at random locations along the polyline samples. Qualitative and quantitative evaluations show that our model can fill missing points with high perceptual quality and adaptively handle a range of gaps. In addition to the simulation experiment, two case studies with map vectorization and trajectory filling illustrate the application prospects of our model.
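The abstract describes the approach only at a high level, so the sketch below is a hypothetical illustration rather than the authors' code. It shows the general masked-sequence encoder-decoder idea in Python with PyTorch: a polyline enters as a sequence of (x, y) vertices with a mask channel marking the gap, a bidirectional recurrent encoder builds global context over the whole line, and a small decoder regresses coordinates for the missing vertices. The class name PolylineGapFiller, the GRU encoder, and the plain masked reconstruction loss are all assumptions made for brevity; the paper itself trains a generative model.

import torch
import torch.nn as nn

class PolylineGapFiller(nn.Module):
    """Hypothetical encoder-decoder for filling gaps in a polyline (illustrative only)."""
    def __init__(self, hidden=128):
        super().__init__()
        # Input per vertex: (x, y, mask_flag); mask_flag = 1 where the vertex is missing.
        self.encoder = nn.GRU(input_size=3, hidden_size=hidden,
                              batch_first=True, bidirectional=True)
        # Map the context-aware features back to 2D coordinates.
        self.decoder = nn.Sequential(
            nn.Linear(2 * hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, points, mask):
        # points: (B, N, 2) polyline vertices; mask: (B, N), 1 at gap positions.
        x = points * (1.0 - mask).unsqueeze(-1)          # zero out the missing vertices
        x = torch.cat([x, mask.unsqueeze(-1)], dim=-1)   # append the mask channel
        feats, _ = self.encoder(x)                       # global context over the whole line
        return self.decoder(feats)                       # (B, N, 2) predicted coordinates

# Toy training step: reconstruct only the masked (missing) vertices.
model = PolylineGapFiller()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

points = torch.randn(8, 64, 2)   # 8 synthetic polylines with 64 vertices each
mask = torch.zeros(8, 64)
mask[:, 20:30] = 1.0             # simulate a gap of 10 consecutive vertices

opt.zero_grad()
pred = model(points, mask)
loss = ((pred - points) ** 2 * mask.unsqueeze(-1)).sum() / mask.sum()
loss.backward()
opt.step()

The masked reconstruction loss here only mimics the gap-simulation setup described in the experiments; the paper's generative training, and its handling of gaps of varying sizes and shapes, would replace the toy loss and the fixed mask in this sketch.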
DOI: 10.1080/13658816.2022.2055036