Dense Nested Attention Network for Infrared Small Target Detection



Bibliographic Details
Published in: IEEE Transactions on Image Processing, Vol. 32, pp. 1745-1758
Main Authors: Li, Boyang; Xiao, Chao; Wang, Longguang; Wang, Yingqian; Lin, Zaiping; Li, Miao; An, Wei; Guo, Yulan
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023
ISSN: 1057-7149; EISSN: 1941-0042
Online Access: Full text
Abstract Single-frame infrared small target (SIRST) detection aims at separating small targets from clutter backgrounds. With the advances of deep learning, CNN-based methods have yielded promising results in generic object detection due to their powerful modeling capability. However, existing CNN-based methods cannot be directly applied to infrared small targets since pooling layers in their networks could lead to the loss of targets in deep layers. To handle this problem, we propose a dense nested attention network (DNA-Net) in this paper. Specifically, we design a dense nested interactive module (DNIM) to achieve progressive interaction among high-level and low-level features. With the repetitive interaction in DNIM, the information of infrared small targets in deep layers can be maintained. Based on DNIM, we further propose a cascaded channel and spatial attention module (CSAM) to adaptively enhance multi-level features. With our DNA-Net, contextual information of small targets can be well incorporated and fully exploited by repetitive fusion and enhancement. Moreover, we develop an infrared small target dataset (namely, NUDT-SIRST) and propose a set of evaluation metrics to conduct comprehensive performance evaluation. Experiments on both public and our self-developed datasets demonstrate the effectiveness of our method. Compared to other state-of-the-art methods, our method achieves better performance in terms of probability of detection (P_d), false-alarm rate (F_a), and intersection of union (IoU).
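The abstract's description of the dense nested interactive module (DNIM) is easiest to picture as a UNet++-style grid of nodes in which each nested node fuses all earlier nodes at its own level with an upsampled node from the level below, so small-target evidence is repeatedly re-injected into deep features. The PyTorch sketch below illustrates only that nesting pattern; the layer widths, the depth, and the concatenate-then-convolve fusion are illustrative assumptions, not the published DNA-Net architecture, and the attention and supervision details are omitted.

# Minimal sketch of a dense nested (UNet++-style) feature interaction block,
# illustrating the repeated high-/low-level fusion described in the abstract.
# Channel widths, depth, and the concat+conv fusion are assumptions, not the
# published DNA-Net configuration.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class DenseNestedSketch(nn.Module):
    def __init__(self, in_ch=1, widths=(16, 32, 64)):
        super().__init__()
        c0, c1, c2 = widths
        self.pool = nn.MaxPool2d(2)
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        # Backbone column X^{i,0}.
        self.x00 = conv_block(in_ch, c0)
        self.x10 = conv_block(c0, c1)
        self.x20 = conv_block(c1, c2)
        # Nested nodes X^{i,j}: fuse all earlier nodes at the same level
        # with the upsampled node from the level below.
        self.x01 = conv_block(c0 + c1, c0)
        self.x11 = conv_block(c1 + c2, c1)
        self.x02 = conv_block(c0 + c0 + c1, c0)
        self.head = nn.Conv2d(c0, 1, 1)

    def forward(self, x):
        x00 = self.x00(x)
        x10 = self.x10(self.pool(x00))
        x20 = self.x20(self.pool(x10))
        x01 = self.x01(torch.cat([x00, self.up(x10)], dim=1))
        x11 = self.x11(torch.cat([x10, self.up(x20)], dim=1))
        x02 = self.x02(torch.cat([x00, x01, self.up(x11)], dim=1))
        return self.head(x02)  # single-channel target saliency map


# Example usage: a 256x256 single-channel infrared image.
# net = DenseNestedSketch(); out = net(torch.randn(1, 1, 256, 256))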
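The cascaded channel and spatial attention module (CSAM) mentioned in the abstract adaptively re-weights multi-level features before they are fused. A minimal sketch of such a cascade follows, assuming a CBAM-like design (channel attention followed by spatial attention) with an arbitrary reduction ratio and kernel size; these choices are not specified in this record and the code is not the authors' implementation.

# Hedged sketch of a cascaded channel-then-spatial attention block in the
# spirit of the CSAM described above. Reduction ratio, kernel size, and the
# cascade order are illustrative assumptions.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling branch
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling branch
        w = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * w


class SpatialAttention(nn.Module):
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)    # channel-wise average map
        mx = x.amax(dim=1, keepdim=True)     # channel-wise max map
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w


class CascadedAttention(nn.Module):
    """Channel attention followed by spatial attention on one feature level."""
    def __init__(self, channels):
        super().__init__()
        self.ca = ChannelAttention(channels)
        self.sa = SpatialAttention()

    def forward(self, x):
        return self.sa(self.ca(x))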
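The record reports results in terms of probability of detection (P_d), false-alarm rate (F_a), and IoU but does not define them. The sketch below shows one common way these SIRST metrics are computed from binary prediction and ground-truth masks; the centroid-distance matching rule and its 3-pixel threshold are illustrative assumptions rather than the authors' exact evaluation protocol.

# Hedged sketch of common SIRST evaluation metrics (IoU, Pd, Fa).
# The matching rule (centroid distance <= dist_thresh pixels) is an
# assumption for illustration; published protocols differ in detail.
import numpy as np
from scipy import ndimage


def sirst_metrics(pred_mask, gt_mask, dist_thresh=3.0):
    """pred_mask, gt_mask: binary HxW arrays (1 = target pixel)."""
    pred = np.asarray(pred_mask).astype(bool)
    gt = np.asarray(gt_mask).astype(bool)

    # Pixel-level intersection over union.
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    iou = inter / union if union > 0 else 1.0

    # Target-level matching via connected components and centroid distance.
    pred_lbl, n_pred = ndimage.label(pred)
    gt_lbl, n_gt = ndimage.label(gt)
    pred_centroids = ndimage.center_of_mass(pred, pred_lbl, range(1, n_pred + 1))
    gt_centroids = ndimage.center_of_mass(gt, gt_lbl, range(1, n_gt + 1))

    matched_pred = set()
    detected = 0
    for gc in gt_centroids:
        for j, pc in enumerate(pred_centroids):
            if j in matched_pred:
                continue
            if np.hypot(gc[0] - pc[0], gc[1] - pc[1]) <= dist_thresh:
                detected += 1
                matched_pred.add(j)
                break
    pd = detected / n_gt if n_gt > 0 else 1.0

    # False-alarm rate: falsely predicted pixels over all image pixels.
    false_pixels = np.logical_and(pred, ~gt).sum()
    fa = false_pixels / pred.size
    return iou, pd, fa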
Author Details (all authors: College of Electronic Science and Technology, National University of Defense Technology (NUDT), Changsha, China)
1. Li, Boyang (ORCID: 0000-0002-4479-9008; email: liboyang20@nudt.edu.cn)
2. Xiao, Chao (ORCID: 0000-0002-9666-8894; email: xiaochao12@nudt.edu.cn)
3. Wang, Longguang (ORCID: 0000-0003-0429-0263; email: wanglongguang15@nudt.edu.cn)
4. Wang, Yingqian (ORCID: 0000-0002-9081-6227; email: wangyingqian16@nudt.edu.cn)
5. Lin, Zaiping (ORCID: 0009-0007-1000-3060; email: linzaiping@nudt.edu.cn)
6. Li, Miao (email: lm8866@nudt.edu.cn)
7. An, Wei (ORCID: 0000-0001-8319-2105; email: anwei@nudt.edu.cn)
8. Guo, Yulan (ORCID: 0000-0003-0952-476X; email: yulan.guo@nudt.edu.cn)
CODEN IIPRE4
ContentType Journal Article
Copyright Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023
DOI 10.1109/TIP.2022.3199107
Discipline Applied Sciences
Engineering
EISSN 1941-0042
EndPage 1758
ExternalDocumentID 35994532
10_1109_TIP_2022_3199107
9864119
Genre orig-research
Journal Article
GrantInformation National Natural Science Foundation of China (Grants 61972435, 61401474, 61921001, 62001478; Funder ID: 10.13039/501100001809)
ISICitedReferencesCount 434
ISSN 1057-7149
1941-0042
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Language English
License https://creativecommons.org/licenses/by/4.0/legalcode
OpenAccessLink https://ieeexplore.ieee.org/document/9864119
PMID 35994532
PageCount 14
PublicationPlace United States (New York)
PublicationTitle IEEE transactions on image processing
PublicationTitleAbbrev TIP
PublicationTitleAlternate IEEE Trans Image Process
PublicationYear 2023
Publisher IEEE
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
StartPage 1745
SubjectTerms Annotations
channel and spatial attention
Clutter
dataset
Datasets
Decoding
deep learning
dense nested interactive module
False alarms
Feature extraction
Infrared small target detection
Modules
Object detection
Object recognition
Performance evaluation
Shape
Target detection
Training
Title Dense Nested Attention Network for Infrared Small Target Detection
URI https://ieeexplore.ieee.org/document/9864119
https://www.ncbi.nlm.nih.gov/pubmed/35994532
https://www.proquest.com/docview/2786984994
https://www.proquest.com/docview/2705749693
Volume 32