Multi-level optimal fusion algorithm for infrared and visible image


Bibliographic Details
Published in: Signal, Image and Video Processing Vol. 17; No. 8; pp. 4209-4217
Main authors: Jian, Bo-Lin; Tu, Ching-Che
Format: Journal Article
Language: English
Published: London: Springer London, 01.11.2023 (Springer Nature B.V.)
ISSN: 1863-1703, 1863-1711
Online access: Full text
Description
Summary: Image fusion technology has been widely used, with fusion performance analyzed under a variety of settings. This paper proposes an image fusion method suitable for both infrared and grayscale visible images. As a first step, the base and detail layers of each image are obtained through a multilayer image decomposition method. For the base layer, we select a fusion method based on a gradient weight map to address the loss of feature detail inherent in the average fusion strategy. For the detail layer, we use a weighted least squares-based fusion strategy to mitigate the impact of noise. A database covering various settings is used to verify the robustness of the method, and the results are compared with other fusion methods using both subjective assessment and objective image quality indicators. The fusion results indicate that the proposed method not only reduces noise in the infrared images but also maintains the desired global contrast. As a result, the fusion process recovers more feature detail while preserving the structure of the feature areas.
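The pipeline the abstract describes can be sketched in a few lines: decompose each input into base and detail layers, fuse the base layers with weights derived from a gradient map, and fuse the detail layers separately. The box-filter decomposition, the gradient-magnitude weights, and the max-absolute detail rule below are illustrative simplifications, not the paper's exact multilayer decomposition or its WLS-based detail strategy:

```python
import numpy as np

def box_blur(img, k=5):
    # Simple separable-free box filter used here as a stand-in for the
    # paper's multilayer decomposition (hypothetical choice).
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    h, w = img.shape
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def decompose(img, k=5):
    # Base layer = low-frequency content; detail layer = residual.
    base = box_blur(img, k)
    return base, img - base

def gradient_weight(base):
    # Gradient-magnitude map; small epsilon avoids division by zero.
    gy, gx = np.gradient(base)
    return np.sqrt(gx ** 2 + gy ** 2) + 1e-6

def fuse(ir, vis):
    b1, d1 = decompose(ir)
    b2, d2 = decompose(vis)
    # Base layers: gradient-weight-map fusion, so regions with stronger
    # structure dominate instead of being averaged away.
    w1, w2 = gradient_weight(b1), gradient_weight(b2)
    base = (w1 * b1 + w2 * b2) / (w1 + w2)
    # Detail layers: keep the larger-magnitude coefficient, a simplified
    # stand-in for the weighted least squares-based strategy in the paper.
    detail = np.where(np.abs(d1) >= np.abs(d2), d1, d2)
    return base + detail
```

Fusing two identical images reproduces the input (base plus residual detail sums back to the original), which is a quick sanity check for any decomposition-based scheme.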
DOI:10.1007/s11760-023-02653-5