Thick cloud and cloud shadow removal in multitemporal imagery using progressively spatio-temporal patch group deep learning


Detailed bibliography
Published in: ISPRS Journal of Photogrammetry and Remote Sensing, Volume 162, pp. 148–160
Main authors: Zhang, Qiang; Yuan, Qiangqiang; Li, Jie; Li, Zhiwei; Shen, Huanfeng; Zhang, Liangpei
Format: Journal Article
Language: English
Published: Elsevier B.V., 01.04.2020
ISSN:0924-2716, 1872-8235
Description
Summary: Thick cloud and its shadow severely reduce the data usability of optical satellite remote sensing data. Although many approaches have been presented for cloud and cloud shadow removal, most of them remain inadequate in dealing with three issues: (1) thick cloud cover over large-scale areas, (2) cases where all the temporal images include cloud or shadow, and (3) deficient utilization of single temporal images. A novel spatio-temporal patch group deep learning framework for gap-filling through multiple temporal cloudy images is proposed to overcome these issues. A global-local loss function is presented to optimize the training model over both cloud-covered and cloud-free regions, considering global consistency as well as local particularity. In addition, weighted aggregation and progressive iteration are utilized to reconstruct the holistic results. A series of simulated and real experiments is then performed to validate the effectiveness of the proposed method, especially on Sentinel-2 MSI and Landsat-8 OLI data with single/multitemporal images over small/large-scale regions, respectively.
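The "global-local" loss described above can be illustrated with a minimal sketch: a global reconstruction term over the whole image combined with a local term restricted to the cloud-covered region. The function name, the squared-error choice, and the balancing weight `lam` are assumptions for illustration; the paper's exact formulation may differ.

```python
import numpy as np

def global_local_loss(pred, target, cloud_mask, lam=0.5):
    """Illustrative global-local reconstruction loss (not the paper's
    exact formula): a global MSE term over the whole image enforces
    global consistency, and a local MSE term restricted to the
    cloud-covered pixels (cloud_mask == 1) preserves local particularity.
    `lam` is a hypothetical balancing weight."""
    sq_err = (pred - target) ** 2
    global_term = sq_err.mean()                 # global consistency over all pixels
    masked = sq_err[cloud_mask == 1]            # errors inside the cloud region only
    local_term = masked.mean() if masked.size else 0.0
    return global_term + lam * local_term

# Example: predicting all zeros against a target of all ones,
# with the entire patch marked as cloud-covered.
pred = np.zeros((2, 2))
target = np.ones((2, 2))
mask = np.ones((2, 2), dtype=int)
loss = global_local_loss(pred, target, mask)    # 1.0 + 0.5 * 1.0 = 1.5
```

In training, the local term would typically be evaluated only where temporal reference data makes the cloud-covered ground truth available (e.g., in simulated-cloud experiments).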
ISSN: 0924-2716, 1872-8235
DOI: 10.1016/j.isprsjprs.2020.02.008