A Graph-Based Superpixel Segmentation Approach Applied to Pansharpening

Published in: Sensors (Basel, Switzerland), Volume 25, Issue 16, p. 4992
Main author: Hallabia, Hind
Format: Journal Article
Language: English
Publication details: Switzerland: MDPI AG, 12 August 2025
ISSN: 1424-8220
Summary: In this paper, an image-driven regional pansharpening technique based on simplex optimization analysis with a graph-based superpixel segmentation strategy is proposed. This fusion approach optimally combines spatial information derived from a high-resolution panchromatic (PAN) image and spectral information captured from a low-resolution multispectral (MS) image to generate a single comprehensive high-resolution MS image. As the performance of such a fusion method relies on the choice of fusion strategy, and in particular on how the injection gain coefficients are estimated, our proposal computes the injection gains over a graph-driven segmentation map. The graph-based segments are obtained by applying simple linear iterative clustering (SLIC) to the MS image, followed by a region adjacency graph (RAG) merging stage. This graphical representation of the segmentation map guides the spatial information injected during fusion. The high-resolution MS image is obtained by locally inferring the details according to the local simplex injection fusion rule. The quality improvements achievable by our proposal are evaluated and validated at reduced and full scales using two high-resolution datasets collected by the GeoEye-1 and WorldView-3 sensors.
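The pipeline the abstract describes (SLIC superpixels on the MS image, RAG-based merging, then per-region detail injection) can be sketched with standard tools. The following is a minimal illustration, not the paper's algorithm: it assumes scikit-image >= 0.19 (where the RAG utilities live in skimage.graph) and float images scaled to [0, 1], and it substitutes a simple per-region least-squares gain and a mean-of-bands intensity for the paper's simplex-optimized injection gains and its low-resolution PAN estimate.

import numpy as np
from skimage import graph
from skimage.segmentation import slic
from skimage.transform import resize

def graph_superpixel_segments(ms, n_segments=600, merge_thresh=0.08):
    # SLIC superpixels on the MS image, then merge adjacent regions whose
    # mean spectra are closer than merge_thresh using a RAG threshold cut.
    labels = slic(ms, n_segments=n_segments, compactness=10, start_label=1)
    rag = graph.rag_mean_color(ms, labels, mode='distance')
    return graph.cut_threshold(labels, rag, merge_thresh)

def regional_injection_pansharpen(ms_lr, pan, merge_thresh=0.08):
    # Upsample the MS cube to the PAN grid, segment it, and inject the PAN
    # detail layer with one gain per band per region.
    h, w = pan.shape
    ms_up = resize(ms_lr, (h, w, ms_lr.shape[-1]), order=3, anti_aliasing=False)
    segments = graph_superpixel_segments(ms_up, merge_thresh=merge_thresh)

    intensity = ms_up.mean(axis=2)   # simplified low-resolution PAN proxy
    details = pan - intensity        # spatial details to be injected

    fused = ms_up.copy()
    for region in np.unique(segments):
        mask = segments == region
        ic = intensity[mask] - intensity[mask].mean()
        var_i = ic @ ic + 1e-12
        for b in range(ms_up.shape[-1]):
            band = ms_up[..., b][mask]
            gain = (band - band.mean()) @ ic / var_i  # regional LS gain
            fused[..., b][mask] += gain * details[mask]
    return fused

Here cut_threshold merges adjacent superpixels whose mean spectra differ by less than merge_thresh, producing the coarser graph-driven segmentation map that then determines where each injection gain applies.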
DOI: 10.3390/s25164992