Infrared and visible image fusion using modified spatial frequency-based clustered dictionary

Detailed Bibliography
Published in: Pattern Analysis and Applications (PAA), Volume 24, Issue 2, pp. 575–589
Main authors: Budhiraja, Sumit; Sharma, Rajat; Agrawal, Sunil; Sohi, Balwinder S.
Medium: Journal Article
Language: English
Publication details: London: Springer London (Springer Nature B.V.), 1 May 2021
ISSN: 1433-7541, 1433-755X
Description
Summary: Infrared and visible image fusion is an active area of research, as it produces a fused image with richer scene information and sharper features. Efficient fusion of images from multisensory sources remains a challenge for researchers. In this paper, an efficient image fusion method based on sparse representation with a clustered dictionary is proposed for infrared and visible images. First, the edge information of the visible image is enhanced using a guided filter. To extract more edge information, modified spatial frequency is used to generate a clustered dictionary from the source images. Then, the non-subsampled contourlet transform (NSCT) is used to obtain the low-frequency and high-frequency sub-bands of the source images. The low-frequency sub-bands are fused using sparse coding, and the high-frequency sub-bands are fused using the max-absolute rule. The final fused image is obtained by applying the inverse NSCT. Subjective and objective evaluations show that the proposed method outperforms other conventional image fusion methods.
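The abstract outlines a multi-step pipeline: guided-filter enhancement, a modified-spatial-frequency-driven clustered dictionary, NSCT decomposition, sparse-coding and max-absolute fusion rules, and inverse NSCT. As a rough illustration only, the Python sketch below implements the generic building blocks under several assumptions not taken from the paper: the modified spatial frequency is modelled as the classic spatial frequency extended with diagonal gradient terms (one common modification), clustering uses plain k-means on the MSF scores, dictionary learning uses scikit-learn's MiniBatchDictionaryLearning in place of the paper's unspecified trainer, and the NSCT itself is assumed to come from an external implementation. All function names are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import MiniBatchDictionaryLearning, SparseCoder

PATCH = 8  # patch side length used throughout the sketch


def modified_spatial_frequency(block):
    """Spatial frequency of a 2-D block, extended with diagonal gradient
    terms (an assumed variant; the paper's exact modification may differ)."""
    rf = np.mean(np.diff(block, axis=1) ** 2)             # row frequency
    cf = np.mean(np.diff(block, axis=0) ** 2)             # column frequency
    d1 = np.mean((block[1:, 1:] - block[:-1, :-1]) ** 2)  # main diagonal
    d2 = np.mean((block[1:, :-1] - block[:-1, 1:]) ** 2)  # anti-diagonal
    return np.sqrt(rf + cf + d1 + d2)


def extract_patches(img, stride=4):
    """Collect overlapping PATCH x PATCH patches as flat row vectors."""
    h, w = img.shape
    return np.array([img[i:i + PATCH, j:j + PATCH].ravel()
                     for i in range(0, h - PATCH + 1, stride)
                     for j in range(0, w - PATCH + 1, stride)])


def clustered_dictionary(sources, n_clusters=3, atoms_per_cluster=64):
    """Pool patches from all source images, group them by MSF score with
    k-means, learn one sub-dictionary per group, and stack the
    sub-dictionaries into a single clustered dictionary."""
    patches = np.vstack([extract_patches(s) for s in sources])
    msf = np.array([modified_spatial_frequency(p.reshape(PATCH, PATCH))
                    for p in patches])
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(
        msf.reshape(-1, 1))
    subdicts = []
    for k in range(n_clusters):
        learner = MiniBatchDictionaryLearning(n_components=atoms_per_cluster)
        subdicts.append(learner.fit(patches[labels == k]).components_)
    dico = np.vstack(subdicts)
    # SparseCoder expects unit-norm atoms
    return dico / np.linalg.norm(dico, axis=1, keepdims=True)


def fuse_low(low_a, low_b, dictionary):
    """Sparse-code the low-frequency patches of both inputs over the
    clustered dictionary and keep, per patch, the code with the larger
    l1 activity (an assumed activity measure); returns fused patch vectors."""
    coder = SparseCoder(dictionary=dictionary, transform_algorithm="omp",
                        transform_n_nonzero_coefs=5)
    ca = coder.transform(extract_patches(low_a))
    cb = coder.transform(extract_patches(low_b))
    keep_a = np.abs(ca).sum(axis=1) >= np.abs(cb).sum(axis=1)
    return np.where(keep_a[:, None], ca, cb) @ dictionary


def fuse_high(high_a, high_b):
    """Max-absolute rule: keep the coefficient with the larger magnitude."""
    return np.where(np.abs(high_a) >= np.abs(high_b), high_a, high_b)
```

Given an NSCT implementation, the full flow would decompose both registered sources, fuse the low-frequency sub-bands with fuse_low and the high-frequency sub-bands with fuse_high, and reconstruct with the inverse transform; overlap-add reassembly of the fused low-frequency patches into an image is likewise omitted here.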
DOI: 10.1007/s10044-020-00919-z