PSCSC-Net: A Deep Coupled Convolutional Sparse Coding Network for Pansharpening

Detailed bibliography
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, pp. 1-16
Main author: Yin, Haitao
Format: Journal Article
Language: English
Published: New York: IEEE, 2022
Publisher: The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
ISSN: 0196-2892, 1558-0644
Description
Summary: Given a low-resolution multispectral (MS) image and a high-resolution panchromatic image, the task of pansharpening is to generate a high-resolution MS image. Deep learning (DL)-based methods have received extensive attention recently. Unlike existing DL-based methods, this article proposes a novel deep neural network for pansharpening inspired by the learned iterative soft thresholding algorithm. First, a coupled convolutional sparse coding-based pansharpening (PSCSC) model and a corresponding traditional optimization algorithm are proposed. Then, following the steps of the traditional algorithm for solving PSCSC, an interpretable end-to-end deep pansharpening network is developed using a deep unfolding strategy. The resulting architecture can also be interpreted as a details injection (DI)-based scheme. This work thus integrates the DL-, DI-, and variational optimization-based schemes into a single framework. Experimental results on reduced- and full-scale datasets demonstrate that the proposed network outperforms popular traditional methods and several recent DL-based methods.
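
The abstract refers to the learned iterative soft thresholding algorithm and a deep unfolding strategy. As a rough illustration of that general idea only, the sketch below shows a single learnable convolutional ISTA-style step in PyTorch; the layer names, filter sizes, and update rule are illustrative assumptions and do not reproduce the actual PSCSC-Net architecture from the paper.

```python
# Minimal sketch of one unfolded ISTA step with convolutional dictionaries,
# illustrating the LISTA / deep-unfolding idea mentioned in the abstract.
# All names, shapes, and hyperparameters are illustrative assumptions,
# not the PSCSC-Net design.
import torch
import torch.nn as nn
import torch.nn.functional as F


def soft_threshold(x, theta):
    # Proximal operator of the L1 norm: sign(x) * max(|x| - theta, 0)
    return torch.sign(x) * F.relu(torch.abs(x) - theta)


class UnfoldedISTAStep(nn.Module):
    """One learnable ISTA iteration for convolutional sparse coding.

    Updates the sparse feature map z given an observation y:
        z <- soft_threshold(z + W_e * (y - W_d * z), theta)
    where W_d acts as a convolutional dictionary and W_e as its learned
    (approximate) adjoint. In a deep-unfolding network, several such steps
    are stacked and trained end to end.
    """

    def __init__(self, num_channels=8, num_filters=32, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        self.decode = nn.Conv2d(num_filters, num_channels, kernel_size, padding=pad)
        self.encode = nn.Conv2d(num_channels, num_filters, kernel_size, padding=pad)
        self.theta = nn.Parameter(torch.full((1, num_filters, 1, 1), 0.01))

    def forward(self, z, y):
        residual = y - self.decode(z)          # reconstruction residual
        z = z + self.encode(residual)          # gradient-like correction
        return soft_threshold(z, self.theta)   # sparsifying proximal step


# Usage sketch: run a few unfolded steps on a dummy multispectral patch.
if __name__ == "__main__":
    step = UnfoldedISTAStep()
    y = torch.randn(1, 8, 64, 64)              # e.g., an 8-band MS patch
    z = torch.zeros(1, 32, 64, 64)             # initial sparse codes
    for _ in range(4):                         # fixed number of unfolded iterations
        z = step(z, y)
    print(z.shape)
```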
DOI: 10.1109/TGRS.2021.3088313