Vision Transformers for Single Image Dehazing
| Published in: | IEEE Transactions on Image Processing, Vol. 32, p. 1927 |
|---|---|
| Main authors: | Song, Yuda; He, Zhuqing; Qian, Hui; Du, Xin (Zhejiang University, Hangzhou, China) |
| Format: | Journal Article |
| Language: | English |
| Published: | United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2023 |
| Subjects: | Artificial neural networks; Datasets; Deep learning; Image dehazing; Image processing; Remote sensing; Spatial data; Vision Transformer |
| ISSN: | 1057-7149 (print), 1941-0042 (electronic) |
| DOI: | 10.1109/TIP.2023.3256763 |
| PMID: | 37030760 |
| Copyright: | The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2023 |
| Online access: | https://ieeexplore.ieee.org/document/10076399 |
| Abstract: | Image dehazing is a representative low-level vision task that estimates latent haze-free images from hazy images. In recent years, convolutional neural network-based methods have dominated image dehazing. However, vision Transformers, which have recently made a breakthrough in high-level vision tasks, have not brought new dimensions to image dehazing. We start with the popular Swin Transformer and find that several of its key designs are unsuitable for image dehazing. To this end, we propose DehazeFormer, which consists of various improvements, such as the modified normalization layer, activation function, and spatial information aggregation scheme. We train multiple variants of DehazeFormer on various datasets to demonstrate its effectiveness. Specifically, on the most frequently used SOTS indoor set, our small model outperforms FFA-Net with only 25% of its parameters and 5% of its computational cost. To the best of our knowledge, our large model is the first method with a PSNR over 40 dB on the SOTS indoor set, dramatically outperforming the previous state-of-the-art methods. We also collect a large-scale realistic remote sensing dehazing dataset for evaluating the method's capability to remove highly non-homogeneous haze. We share our code and dataset at https://github.com/IDKiro/DehazeFormer. |
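The abstract attributes DehazeFormer's gains to a modified normalization layer, activation function, and spatial information aggregation scheme relative to the Swin Transformer. Purely to illustrate where those three pieces sit in a Swin-style block, below is a minimal PyTorch sketch in which the norm and activation are injectable. All class and parameter names here are hypothetical, and the block deliberately omits DehazeFormer's actual modifications (as well as Swin's shifted windows and relative position bias); the authors' real implementation is in the repository linked in the abstract.

```python
import torch
import torch.nn as nn


class WindowAttention(nn.Module):
    """Plain (non-shifted) window multi-head self-attention, illustrative only."""

    def __init__(self, dim, num_heads, window_size):
        super().__init__()
        self.window_size = window_size
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x):  # x: (B, H, W, C); H and W divisible by window_size
        B, H, W, C = x.shape
        ws = self.window_size
        # Partition the feature map into non-overlapping ws x ws windows.
        x = x.view(B, H // ws, ws, W // ws, ws, C)
        x = x.permute(0, 1, 3, 2, 4, 5).reshape(-1, ws * ws, C)
        # Self-attention within each window.
        x, _ = self.attn(x, x, x, need_weights=False)
        # Reverse the window partition back to (B, H, W, C).
        x = x.view(B, H // ws, W // ws, ws, ws, C)
        x = x.permute(0, 1, 3, 2, 4, 5).reshape(B, H, W, C)
        return x


class TransformerBlock(nn.Module):
    """Swin-style block with injectable norm and activation, marking where a
    'modified normalization layer' and 'activation function' would plug in."""

    def __init__(self, dim, num_heads=4, window_size=8,
                 norm_layer=nn.LayerNorm, act_layer=nn.ReLU):
        super().__init__()
        self.norm1 = norm_layer(dim)
        self.attn = WindowAttention(dim, num_heads, window_size)
        self.norm2 = norm_layer(dim)
        self.mlp = nn.Sequential(
            nn.Linear(dim, 4 * dim), act_layer(), nn.Linear(4 * dim, dim))

    def forward(self, x):
        x = x + self.attn(self.norm1(x))  # spatial information aggregation
        x = x + self.mlp(self.norm2(x))
        return x


# Smoke test on a dummy 16x16 feature map with 32 channels.
block = TransformerBlock(dim=32)
print(block(torch.randn(2, 16, 16, 32)).shape)  # torch.Size([2, 16, 16, 32])
```

Swapping the `norm_layer` and `act_layer` arguments is the kind of ablation surface the abstract describes, and the `WindowAttention` module marks where a different spatial aggregation scheme would be substituted.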