A Graph-Based Superpixel Segmentation Approach Applied to Pansharpening


Full description

Bibliographic details
Published in: Sensors (Basel, Switzerland), Vol. 25, Issue 16, p. 4992
First author: Hallabia, Hind
Format: Journal Article
Language: English
Published: Switzerland, MDPI AG, 12 August 2025
ISSN: 1424-8220
Online access: Full text
Description
Abstract: In this paper, an image-driven regional pansharpening technique based on simplex optimization analysis with a graph-based superpixel segmentation strategy is proposed. This fusion approach optimally combines spatial information derived from a high-resolution panchromatic (PAN) image with spectral information captured from a low-resolution multispectral (MS) image to generate a single, comprehensive high-resolution MS image. Since the performance of such a fusion method depends on the choice of fusion strategy, and in particular on how the injection gain coefficients are estimated, our proposal computes the injection gains over a graph-driven segmentation map. The graph-based segments are obtained by applying simple linear iterative clustering (SLIC) to the MS image, followed by a region adjacency graph (RAG) merging stage. This graphical representation of the segmentation map guides the spatial information injected during fusion. The high-resolution MS image is obtained by inferring the details locally according to the local simplex injection fusion rule. The quality improvements achievable by our proposal are evaluated and validated at reduced and full scales using two high-resolution datasets collected by the GeoEye-1 and WorldView-3 sensors.
DOI:10.3390/s25164992