Multi-level optimal fusion algorithm for infrared and visible image


Bibliographic Details
Published in:Signal, image and video processing Vol. 17; no. 8; pp. 4209 - 4217
Main Authors: Jian, Bo-Lin, Tu, Ching-Che
Format: Journal Article
Language:English
Published: London Springer London 01.11.2023
Springer Nature B.V
ISSN:1863-1703, 1863-1711
Description
Summary:Image fusion technology is widely used to analyze fusion performance under various settings. This paper proposes an image fusion method suitable for both infrared and grayscale visible images. As a first step, the base and detail layers of each image are obtained through a multilayer image decomposition method. For the base layer, a fusion method based on a gradient weight map is selected to address the loss of feature detail inherent in the average fusion strategy. For the detail layer, a weighted least squares-based fusion strategy is used to mitigate the impact of noise. A database covering various settings is used to verify the robustness of the methodology, and the results are compared with other fusion methods using both subjective assessment and objective image quality indicators for easier verification. The fusion results indicate that the proposed method not only reduces noise in the infrared images but also maintains the desired global contrast. As a result, the fusion process retrieves more feature details while preserving the structure of the feature area.
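The pipeline described in the summary (decompose into base and detail layers, fuse the base layers with gradient-derived weights, fuse the detail layers, then recombine) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian-blur decomposition, the `gradient_weight` helper, and the max-absolute detail selection are simple stand-ins for the paper's multilayer decomposition, gradient weight map, and weighted least squares strategy, whose exact formulations are not given in this record.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def decompose(img, sigma=2.0):
    # Base layer via Gaussian smoothing (stand-in for the paper's
    # multilayer decomposition); the detail layer is the residual.
    base = gaussian_filter(img, sigma)
    return base, img - base

def gradient_weight(img, eps=1e-6):
    # Per-pixel gradient magnitude used as a fusion weight.
    gy, gx = np.gradient(img)
    return np.hypot(gx, gy) + eps  # eps avoids zero weights in flat regions

def fuse(ir, vis, sigma=2.0):
    b_ir, d_ir = decompose(ir, sigma)
    b_vis, d_vis = decompose(vis, sigma)
    # Base fusion: gradient-based weighting instead of a plain average,
    # so strongly structured regions dominate the fused base layer.
    w_ir, w_vis = gradient_weight(b_ir), gradient_weight(b_vis)
    base = (w_ir * b_ir + w_vis * b_vis) / (w_ir + w_vis)
    # Detail fusion: keep the stronger response per pixel (a simplified
    # substitute for the paper's weighted least squares-based strategy).
    detail = np.where(np.abs(d_ir) >= np.abs(d_vis), d_ir, d_vis)
    return base + detail

rng = np.random.default_rng(0)
ir = rng.random((64, 64))   # hypothetical infrared image
vis = rng.random((64, 64))  # hypothetical grayscale visible image
fused = fuse(ir, vis)
print(fused.shape)  # (64, 64)
```

Because the weights are strictly positive and normalized, the fused base layer is a convex combination of the two input base layers, so the global brightness range stays bounded by the inputs.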
DOI:10.1007/s11760-023-02653-5