DRSNFuse: Deep Residual Shrinkage Network for Infrared and Visible Image Fusion

Published in: Sensors (Basel, Switzerland), Volume 22, Issue 14, p. 5149
Main authors: Wang, Hongfeng; Wang, Jianzhong; Xu, Haonan; Sun, Yong; Yu, Zibo
Medium: Journal Article
Language: English
Published: Basel: MDPI AG, 8 July 2022
ISSN: 1424-8220
Abstract Infrared images are robust against illumination variation and disguises and contain the sharp edge contours of objects, while visible images are rich in texture details. Infrared and visible image fusion seeks to obtain high-quality images that keep the advantages of both source images. This paper proposes an object-aware image fusion method based on a deep residual shrinkage network, termed DRSNFuse. DRSNFuse exploits residual shrinkage blocks for image fusion and introduces a deeper network for infrared and visible image fusion than existing methods based on fully convolutional networks. The deeper network effectively extracts semantic information, while the residual shrinkage blocks maintain texture information throughout the whole network. The residual shrinkage blocks adapt a channel-wise attention mechanism to the fusion task, enabling feature-map channels to focus on objects and backgrounds separately. A novel image fusion loss function is proposed to improve fusion performance and suppress artifacts. Trained with the proposed loss function, DRSNFuse generates fused images with fewer artifacts and more of the original texture, which also better match the human visual system. Experiments show that our method outperforms mainstream methods in quantitative comparisons and produces fused images with brighter targets, sharper edge contours, richer details, and fewer artifacts.
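The residual shrinkage blocks described in the abstract pair a residual branch with channel-wise soft thresholding whose thresholds are learned by a small attention sub-network. As a rough, hedged illustration only, the following PyTorch sketch shows one such block; the class name ResidualShrinkageBlock, the layer sizes, and the wiring are assumptions made for this example and do not reproduce the authors' DRSNFuse architecture, encoder-decoder layout, or loss function.

```python
# Hedged sketch of a residual shrinkage block with channel-wise soft
# thresholding, in the spirit of deep residual shrinkage networks.
# All layer sizes and names are illustrative assumptions, not the
# authors' DRSNFuse implementation.
import torch
import torch.nn as nn


class ResidualShrinkageBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Residual branch: two 3x3 convolutions with BN/ReLU pre-activation.
        self.body = nn.Sequential(
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels), nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )
        # Channel-wise attention sub-network: predicts a factor in (0, 1) per
        # channel; the soft threshold is that factor times the channel's mean
        # absolute response.
        self.attention = nn.Sequential(
            nn.Linear(channels, channels), nn.ReLU(inplace=True),
            nn.Linear(channels, channels), nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        residual = self.body(x)
        abs_mean = residual.abs().mean(dim=(2, 3))   # (N, C) per-channel statistic
        tau = abs_mean * self.attention(abs_mean)    # learned channel-wise thresholds
        tau = tau.unsqueeze(-1).unsqueeze(-1)        # (N, C, 1, 1) for broadcasting
        # Soft thresholding: shrink small (noise-like) responses toward zero
        # while keeping large (object/edge) responses.
        shrunk = torch.sign(residual) * torch.clamp(residual.abs() - tau, min=0.0)
        return x + shrunk


if __name__ == "__main__":
    block = ResidualShrinkageBlock(64)
    out = block(torch.randn(1, 64, 128, 128))
    print(out.shape)  # torch.Size([1, 64, 128, 128])
```

Stacking such blocks keeps a shortcut path for texture while the learned thresholding suppresses small channel-wise responses, which matches the intuition the abstract appeals to (texture preservation plus artifact suppression).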
Author Yu, Zibo
Wang, Hongfeng
Wang, Jianzhong
Sun, Yong
Xu, Haonan
AuthorAffiliation 1 School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China; 3120185177@bit.edu.cn (H.W.); 3120200259@bit.edu.cn (H.X.); 3120195181@bit.edu.cn (Y.S.); 3120200269@bit.edu.cn (Z.Y.)
2 State Key Laboratory of Explosion Science and Technology, Beijing Institute of Technology, Beijing 100081, China
CitedBy DOIs: 10.3390/s23167097; 10.3390/act14020050; 10.3390/s24175860; 10.1109/TGRS.2024.3500036; 10.1016/j.infrared.2023.104796
ContentType Journal Article
Copyright 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
2022 by the authors. 2022
DOI 10.3390/s22145149
Discipline Engineering
EISSN 1424-8220
ExternalDocumentID oai_doaj_org_article_16078b0f3d58430791df8fd0cb07fa53
PMC9318496
10_3390_s22145149
GrantInformation_xml – fundername: Defense Industrial Technology Development Program
  grantid: JCKY2019602C015
ISSN 1424-8220
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 14
Language English
License Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
PMID 35890828
PublicationTitle Sensors (Basel, Switzerland)
PublicationYear 2022
Publisher MDPI AG
StartPage 5149
SubjectTerms Algorithms
artificial texture suppression
auto encoder and decoder
channel-wise attention mechanism
Deep learning
deep residual shrinkage network
image fusion
Neural networks
Semantics
Title DRSNFuse: Deep Residual Shrinkage Network for Infrared and Visible Image Fusion
URI https://www.proquest.com/docview/2694065680
https://www.proquest.com/docview/2695289122
https://pubmed.ncbi.nlm.nih.gov/PMC9318496
https://doaj.org/article/16078b0f3d58430791df8fd0cb07fa53
Volume 22