PSCSC-Net: A Deep Coupled Convolutional Sparse Coding Network for Pansharpening



Bibliographic Details
Published in: IEEE Transactions on Geoscience and Remote Sensing, Vol. 60, pp. 1-16
Main Author: Yin, Haitao
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2022
ISSN: 0196-2892, 1558-0644
Online Access: Full text
Description
Abstract: Given a low-resolution multispectral (MS) image and a high-resolution panchromatic image, the task of pansharpening is to generate a high-resolution MS image. Deep learning (DL)-based methods have received extensive attention recently. Different from the existing DL-based methods, this article proposes a novel deep neural network for pansharpening inspired by the learned iterative soft-thresholding algorithm. First, a coupled convolutional sparse coding-based pansharpening (PSCSC) model and a related traditional optimization algorithm are proposed. Then, following the procedure of the traditional algorithm for solving PSCSC, an interpretable end-to-end deep pansharpening network is developed using a deep unfolding strategy. The designed deep architecture can also be understood from the viewpoint of the details injection (DI)-based scheme. This work offers a solution that integrates the DL-, DI-, and variational optimization-based schemes into a single framework. Experimental results on reduced- and full-scale datasets demonstrate that the proposed deep pansharpening network outperforms popular traditional methods and some current DL-based methods.
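The abstract states that the network is inspired by the learned iterative soft-thresholding algorithm (LISTA), whose core nonlinearity is the soft-thresholding (shrinkage) operator. As background only (the function name and scalar form below are illustrative, not taken from the paper), a minimal sketch of that operator:

```python
def soft_threshold(x, lam):
    """Soft-thresholding (shrinkage) operator:
    prox of lam*|.|, i.e. sign(x) * max(|x| - lam, 0).
    In unfolded networks such as LISTA, lam is typically a learnable
    per-layer parameter rather than a fixed constant."""
    if x > lam:
        return x - lam      # shrink positive values toward zero
    if x < -lam:
        return x + lam      # shrink negative values toward zero
    return 0.0              # zero out small coefficients

# Example: shrinkage promotes sparsity by zeroing small entries.
codes = [3.0, -0.5, 0.2, -2.0]
sparse_codes = [soft_threshold(c, 1.0) for c in codes]
```

In a deep unfolding network, each iteration of the traditional algorithm (a linear update followed by this shrinkage) becomes one network layer, which is what makes the resulting architecture interpretable.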
DOI: 10.1109/TGRS.2021.3088313