UIU-Net: U-Net in U-Net for Infrared Small Object Detection
| Published in: | IEEE Transactions on Image Processing, Vol. 32, pp. 364-376 |
|---|---|
| Main authors: | Wu, Xin; Hong, Danfeng; Chanussot, Jocelyn |
| Format: | Journal Article |
| Language: | English |
| Published: | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2023 |
| Subjects: | Infrared small object; deep learning; feature interaction; attention mechanism; deep multi-scale feature; local and global context information |
| ISSN: | 1057-7149 (print); 1941-0042 (electronic) |
| Online access: | Full text |
| Abstract | Learning-based infrared small object detection methods currently rely heavily on classification backbone networks, which tends to cause tiny-object loss and limited feature distinguishability as the network depth increases. Furthermore, small objects in infrared images frequently appear both bright and dark, posing severe demands on obtaining precise object contrast information. For this reason, in this paper we propose a simple and effective "U-Net in U-Net" framework, UIU-Net for short, to detect small objects in infrared images. As the name suggests, UIU-Net embeds a tiny U-Net into a larger U-Net backbone, enabling multi-level and multi-scale representation learning of objects. Moreover, UIU-Net can be trained from scratch, and the learned features can effectively enhance global and local contrast information. More specifically, the UIU-Net model is divided into two modules: the resolution-maintenance deep supervision (RM-DS) module and the interactive-cross attention (IC-A) module. RM-DS integrates residual U-blocks into a deep supervision network to generate deep multi-scale resolution-maintenance features while learning global context information. Further, IC-A encodes the local context information between low-level details and high-level semantic features. Extensive experiments conducted on two infrared single-frame image datasets, i.e., the SIRST and Synthetic datasets, show the effectiveness and superiority of the proposed UIU-Net in comparison with several state-of-the-art infrared small object detection methods. The proposed UIU-Net also generalizes well to video-sequence infrared small object datasets, e.g., the ATR ground/air video sequence dataset. The code for this work is openly available at https://github.com/danfenghong/IEEE_TIP_UIU-Net. |
|---|---|
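The abstract specifies the architecture only at the module level. As a concrete illustration, below is a minimal PyTorch sketch of the core "U-Net in U-Net" idea, in which each stage of an outer U-Net is itself a tiny U-Net (a residual U-block) operating on its feature map. This is a hedged reconstruction, not the authors' implementation: the channel widths, the two-level depth, and the names `TinyUNetBlock` and `OuterUNet` are illustrative assumptions, and the paper's actual code lives at https://github.com/danfenghong/IEEE_TIP_UIU-Net.

```python
# Minimal sketch of a "U-Net in U-Net": an outer U-Net whose stages are tiny
# U-Nets (residual U-blocks). Illustrative reconstruction, NOT the authors'
# code; all sizes are arbitrary. Assumes even spatial input sizes.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_bn_relu(c_in, c_out):
    # 3x3 convolution -> batch norm -> ReLU, the usual U-Net building brick
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, kernel_size=3, padding=1),
        nn.BatchNorm2d(c_out),
        nn.ReLU(inplace=True),
    )


class TinyUNetBlock(nn.Module):
    """A two-level U-Net applied inside a single stage (a residual U-block)."""

    def __init__(self, c_in, c_mid, c_out):
        super().__init__()
        self.inp = conv_bn_relu(c_in, c_out)
        self.enc1 = conv_bn_relu(c_out, c_mid)
        self.enc2 = conv_bn_relu(c_mid, c_mid)
        self.dec1 = conv_bn_relu(2 * c_mid, c_out)

    def forward(self, x):
        hx = self.inp(x)
        e1 = self.enc1(hx)
        e2 = self.enc2(F.max_pool2d(e1, 2))            # inner downsample
        e2 = F.interpolate(e2, size=e1.shape[2:],
                           mode="bilinear", align_corners=False)
        d1 = self.dec1(torch.cat([e2, e1], dim=1))     # inner skip connection
        return d1 + hx  # residual add keeps the stage's input resolution


class OuterUNet(nn.Module):
    """Outer U-Net whose encoder/decoder stages are themselves tiny U-Nets."""

    def __init__(self, c=16):
        super().__init__()
        self.stage1 = TinyUNetBlock(1, c, 2 * c)   # infrared: 1 input channel
        self.stage2 = TinyUNetBlock(2 * c, c, 4 * c)
        self.fuse = TinyUNetBlock(6 * c, c, 2 * c)  # decoder over outer skip
        self.head = nn.Conv2d(2 * c, 1, kernel_size=1)

    def forward(self, x):
        s1 = self.stage1(x)
        s2 = self.stage2(F.max_pool2d(s1, 2))          # outer downsample
        s2 = F.interpolate(s2, size=s1.shape[2:],
                           mode="bilinear", align_corners=False)
        d = self.fuse(torch.cat([s2, s1], dim=1))      # outer skip connection
        return torch.sigmoid(self.head(d))  # per-pixel small-object score map


# Shape check: a 64x64 single-channel frame maps to a 64x64 score map.
# out = OuterUNet()(torch.randn(1, 1, 64, 64))  # torch.Size([1, 1, 64, 64])
```

Because every downsampling inside a block is undone before the residual addition, each stage hands its input resolution through unchanged, which is the property the abstract's "resolution-maintenance" features depend on for keeping tiny objects visible at depth.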
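The IC-A module is likewise described only as encoding local context between low-level details and high-level semantics. One plausible minimal form of such interactive-cross attention is sketched below, where each stream derives a channel gate for the other; the global-average-pooling and 1x1-convolution choices, and the name `CrossAttention2d`, are assumptions for illustration rather than the paper's exact design.

```python
# Hedged sketch of interactive cross attention between a low-level detail
# map and a high-level semantic map; an assumption-laden stand-in for IC-A.
import torch
import torch.nn as nn


class CrossAttention2d(nn.Module):
    """Each feature stream reweights the channels of the other stream."""

    def __init__(self, c_low, c_high):
        super().__init__()
        # squeeze the *other* stream into a per-channel gate in [0, 1]
        self.gate_for_low = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(c_high, c_low, 1), nn.Sigmoid())
        self.gate_for_high = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(c_low, c_high, 1), nn.Sigmoid())

    def forward(self, f_low, f_high):
        # f_low: low-level detail map, f_high: high-level semantic map,
        # both already at the same spatial size (upsample f_high beforehand)
        a_low = f_low * self.gate_for_low(f_high)    # semantics gate details
        a_high = f_high * self.gate_for_high(f_low)  # details gate semantics
        return a_low, a_high


# Example: fuse a 32-channel detail map with a 64-channel semantic map.
# a_low, a_high = CrossAttention2d(32, 64)(torch.randn(1, 32, 64, 64),
#                                          torch.randn(1, 64, 64, 64))
```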
| Author | Wu, Xin; Hong, Danfeng; Chanussot, Jocelyn |
| Authors and affiliations | 1. Wu, Xin: School of Computer Science (National Pilot Software Engineering School), Beijing University of Posts and Telecommunications, Beijing, China. 2. Hong, Danfeng (ORCID 0000-0002-3212-9584): Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing, China. 3. Chanussot, Jocelyn (ORCID 0000-0003-4817-2875): Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing, China. |
| Links | PubMed: https://www.ncbi.nlm.nih.gov/pubmed/37015404; HAL: https://hal.science/hal-04473605 |
| CODEN | IIPRE4 |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023. Distributed under a Creative Commons Attribution 4.0 International License. |
| DOI | 10.1109/TIP.2022.3228497 |
| Indexed in | IEEE Xplore (IEEE Electronic Library); IEEE All-Society Periodicals Package (ASPP) 1998-Present; CrossRef; PubMed; Computer and Information Systems Abstracts (Academic and Professional); Electronics & Communications Abstracts; Technology Research Database; ProQuest Computer Science Collection; Advanced Technologies Database with Aerospace; MEDLINE - Academic; Hyper Article en Ligne (HAL) |
| Discipline | Applied Sciences; Engineering; Computer Science |
| EISSN | 1941-0042 |
| EndPage | 376 |
| ExternalDocumentID | oai:HAL:hal-04473605v1 37015404 10_1109_TIP_2022_3228497 9989433 |
| Genre | orig-research Journal Article |
| Funding | AXA Research Fund (10.13039/501100001961); MIAI@Grenoble Alpes (Grant ANR-19-P3IA-0003); National Natural Science Foundation of China (Grants 42271350 and 62101045; 10.13039/501100001809) |
| ISICitedReferencesCount | 610 |
| ISSN | 1057-7149; 1941-0042 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Keywords | Infrared small object; deep learning; feature interaction; attention mechanism; deep multi-scale feature; local and global context information |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html; https://doi.org/10.15223/policy-029; https://doi.org/10.15223/policy-037; Distributed under a Creative Commons Attribution 4.0 International License: http://creativecommons.org/licenses/by/4.0 |
| ORCID | Hong, Danfeng: 0000-0002-3212-9584; Chanussot, Jocelyn: 0000-0003-4817-2875 |
| PMID | 37015404 |
| PQID | 2756560729 |
| PQPubID | 85429 |
| PageCount | 13 |
| PublicationDate | 2023-01-01 |
| PublicationPlace | United States |
| PublicationTitle | IEEE transactions on image processing |
| PublicationTitleAbbrev | TIP |
| PublicationTitleAlternate | IEEE Trans Image Process |
| PublicationYear | 2023 |
| Publisher | IEEE (The Institute of Electrical and Electronics Engineers, Inc.) |
| StartPage | 364 |
| SubjectTerms | Artificial Intelligence; attention mechanism; Computer networks; Computer Science; Context; Datasets; Decoding; deep learning; deep multi-scale feature; Feature extraction; feature interaction; Image resolution; Infrared imagery; Infrared small object; Integrated circuits; Learning; local and global context information; Maintenance; Modules; Object detection; Object recognition; Semantics; Visualization |
| Title | UIU-Net: U-Net in U-Net for Infrared Small Object Detection |
| URI | https://ieeexplore.ieee.org/document/9989433 https://www.ncbi.nlm.nih.gov/pubmed/37015404 https://www.proquest.com/docview/2756560729 https://www.proquest.com/docview/2796161192 https://hal.science/hal-04473605 |
| Volume | 32 |