EuroSAT: A Novel Dataset and Deep Learning Benchmark for Land Use and Land Cover Classification


Detailed Bibliography
Published in: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, Volume 12, Issue 7, pp. 2217–2226
Main Authors: Helber, Patrick; Bischke, Benjamin; Dengel, Andreas; Borth, Damian
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 1 July 2019
ISSN: 1939-1404, 2151-1535
Description
Abstract: In this paper, we present a patch-based land use and land cover classification approach using Sentinel-2 satellite images. The Sentinel-2 satellite images are openly and freely accessible, and are provided in the earth observation program Copernicus. We present a novel dataset, based on these images, that covers 13 spectral bands and is comprised of ten classes with a total of 27,000 labeled and geo-referenced images. Benchmarks are provided for this novel dataset with its spectral bands using state-of-the-art deep convolutional neural networks. An overall classification accuracy of 98.57% was achieved with the proposed novel dataset. The resulting classification system opens a gate toward a number of earth observation applications. We demonstrate how this classification system can be used for detecting land use and land cover changes, and how it can assist in improving geographical maps. The geo-referenced dataset EuroSAT is made publicly available at https://github.com/phelber/eurosat.
DOI: 10.1109/JSTARS.2019.2918242