Underwater Image Enhancement via Minimal Color Loss and Locally Adaptive Contrast Enhancement
| Published in: | IEEE Transactions on Image Processing, Volume 31, pp. 3997-4010 |
|---|---|
| Main authors: | Zhang, Weidong; Zhuang, Peixian; Sun, Hai-Han; Li, Guohou; Kwong, Sam; Li, Chongyi |
| Format: | Journal Article |
| Language: | English |
| Published: | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.01.2022 |
| ISSN: | 1057-7149 |
| EISSN: | 1941-0042 |
| Abstract | Underwater images typically suffer from color deviations and low visibility due to wavelength-dependent light absorption and scattering. To deal with these degradation issues, we propose an efficient and robust underwater image enhancement method, called MLLE. Specifically, we first locally adjust the color and details of an input image according to a minimum color loss principle and a maximum attenuation map-guided fusion strategy. Afterward, we employ integral and squared integral maps to compute the mean and variance of local image blocks, which are used to adaptively adjust the contrast of the input image. Meanwhile, a color balance strategy is introduced to balance the color differences between channel a and channel b in the CIELAB color space. Our enhanced results are characterized by vivid color, improved contrast, and enhanced details. Extensive experiments on three underwater image enhancement datasets demonstrate that our method outperforms state-of-the-art methods. Our method is also appealing for its fast processing speed, taking less than 1 s to process a 1024×1024×3 image on a single CPU. Experiments further suggest that our method can effectively improve the performance of underwater image segmentation, keypoint detection, and saliency detection. The project page is available at https://li-chongyi.github.io/proj_MMLE.html |
|---|---|
| Author | Sun, Hai-Han; Li, Guohou; Zhang, Weidong; Li, Chongyi; Kwong, Sam; Zhuang, Peixian |
| Author_xml | 1. Weidong Zhang (ORCID 0000-0003-2495-4469; zwd_wd@163.com), School of Information Engineering, Henan Institute of Science and Technology, Xinxiang, China. 2. Peixian Zhuang (ORCID 0000-0002-7143-9569; zhuangpeixian0624@163.com), Department of Automation, Tsinghua University, Beijing, China. 3. Hai-Han Sun (ORCID 0000-0003-2749-9916; hannah.h.sun@outlook.com), School of Mechanical and Aerospace Engineering, Nanyang Technological University, Singapore. 4. Guohou Li (liguohou6@163.com), School of Information Engineering, Henan Institute of Science and Technology, Xinxiang, China. 5. Sam Kwong (ORCID 0000-0001-7484-7261; cssamk@cityu.edu.hk), Department of Computer Science, City University of Hong Kong, Hong Kong SAR, China. 6. Chongyi Li (ORCID 0000-0003-2609-2460; lichongyi25@gmail.com), School of Computer Science and Engineering, Nanyang Technological University, Singapore. |
| BackLink | https://www.ncbi.nlm.nih.gov/pubmed/35657839 (View this record in MEDLINE/PubMed) |
| CODEN | IIPRE4 |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022 |
| DOI | 10.1109/TIP.2022.3177129 |
| DatabaseName | IEEE Xplore (IEEE); IEEE All-Society Periodicals Package (ASPP) 1998–Present; IEEE/IET Electronic Library; CrossRef; PubMed; Computer and Information Systems Abstracts; Electronics & Communications Abstracts; Technology Research Database; ProQuest Computer Science Collection; Advanced Technologies Database with Aerospace; Computer and Information Systems Abstracts – Academic; Computer and Information Systems Abstracts Professional; MEDLINE - Academic |
| DatabaseTitle | CrossRef; PubMed; Technology Research Database; Computer and Information Systems Abstracts – Academic; Electronics & Communications Abstracts; ProQuest Computer Science Collection; Computer and Information Systems Abstracts; Advanced Technologies Database with Aerospace; Computer and Information Systems Abstracts Professional; MEDLINE - Academic |
| DatabaseTitleList | PubMed MEDLINE - Academic Technology Research Database |
| Discipline | Applied Sciences Engineering |
| EISSN | 1941-0042 |
| EndPage | 4010 |
| ExternalDocumentID | 35657839 10_1109_TIP_2022_3177129 9788535 |
| Genre | orig-research Journal Article |
| GrantInformation_xml | National Natural Science Foundation of China (grants 62171252, 61701245, 62071272, 61701247, 62001158; 10.13039/501100001809); China Postdoctoral Science Foundation (grant 2019M660438; 10.13039/501100002858); Postdoctoral Science Foundation of China (grant 2021M701903; 10.13039/501100002858); MindSpore, Compute Architecture for Neural Networks (CANN), and Ascend Artificial Intelligence (AI) Processor; National Key Research and Development Program of China (grant 2020AAA0130000; 10.13039/501100012166) |
| ISICitedReferencesCount | 540 |
| ISSN | 1057-7149 1941-0042 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| ORCID | 0000-0003-2609-2460 0000-0002-7143-9569 0000-0003-2495-4469 0000-0001-7484-7261 0000-0003-2749-9916 |
| PMID | 35657839 |
| PQID | 2675049057 |
| PQPubID | 85429 |
| PageCount | 14 |
| PublicationCentury | 2000 |
| PublicationDate | 2022-01-01 |
| PublicationDecade | 2020 |
| PublicationPlace | United States |
| PublicationPlace_xml | – name: United States – name: New York |
| PublicationTitle | IEEE transactions on image processing |
| PublicationTitleAbbrev | TIP |
| PublicationTitleAlternate | IEEE Trans Image Process |
| PublicationYear | 2022 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 3997 |
| SubjectTerms | Attenuation; Channel estimation; Color; color correction; contrast enhancement; Degradation; Electromagnetic absorption; Image color analysis; Image contrast; Image enhancement; Image segmentation; Imaging; Learning systems; light scattering; Low visibility; Underwater; Underwater image enhancement; underwater imaging |
| Title | Underwater Image Enhancement via Minimal Color Loss and Locally Adaptive Contrast Enhancement |
| URI | https://ieeexplore.ieee.org/document/9788535 https://www.ncbi.nlm.nih.gov/pubmed/35657839 https://www.proquest.com/docview/2675049057 https://www.proquest.com/docview/2673358722 |
| Volume | 31 |
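
Two of the building blocks named in the abstract are straightforward to prototype: constant-time local block statistics from integral and squared-integral maps, and a chroma balance in the CIELAB color space. The NumPy sketch below is illustrative only; the window radius, the contrast gain rule in `adaptive_contrast`, and the gray-world-style a/b centering in `balance_ab` are assumptions made for the example and are not claimed to match the exact operators used in MLLE.

```python
# Illustrative sketch only (NumPy). The window radius, the gain rule, and the
# balance_ab centering are assumptions for illustration, not the paper's
# exact operators.
import numpy as np


def _box_sum(padded, r):
    """Sums over (2r+1)x(2r+1) windows via an integral image (summed-area table)."""
    ii = np.zeros((padded.shape[0] + 1, padded.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = np.cumsum(np.cumsum(padded, axis=0), axis=1)
    k = 2 * r + 1
    return ii[k:, k:] - ii[:-k, k:] - ii[k:, :-k] + ii[:-k, :-k]


def local_mean_var(channel, r=7):
    """Per-pixel mean and variance over a (2r+1)^2 block, in O(1) per pixel,
    computed from integral and squared-integral maps."""
    x = channel.astype(np.float64)
    p = np.pad(x, r, mode="reflect")
    n = (2 * r + 1) ** 2
    s1 = _box_sum(p, r)          # sum of values in each block
    s2 = _box_sum(p * p, r)      # sum of squared values in each block
    mean = s1 / n
    var = np.maximum(s2 / n - mean * mean, 0.0)
    return mean, var


def adaptive_contrast(channel, r=7, target_sigma=None):
    """Locally adaptive contrast stretch around the local mean:
    out = mean + gain * (x - mean), with gain = target_sigma / local_sigma."""
    x = channel.astype(np.float64)
    mean, var = local_mean_var(x, r)
    sigma = np.sqrt(var) + 1e-6
    if target_sigma is None:
        target_sigma = x.std()                      # push local contrast toward the global level
    gain = np.clip(target_sigma / sigma, 0.5, 3.0)  # cap amplification in flat or noisy regions
    return np.clip(mean + gain * (x - mean), 0.0, 255.0)


def balance_ab(a_chan, b_chan, strength=1.0):
    """Gray-world-style centering of the CIELAB a/b channels: shifting each
    chroma channel toward zero mean reduces the global color cast."""
    return a_chan - strength * a_chan.mean(), b_chan - strength * b_chan.mean()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gray = rng.uniform(0, 255, size=(64, 64))       # stand-in for one image channel in [0, 255]
    out = adaptive_contrast(gray, r=7)
    print(out.shape, float(out.min()), float(out.max()))
```

The integral-image trick is what keeps the per-pixel cost independent of the block size, which is consistent with the sub-second CPU timing for a 1024×1024×3 image reported in the abstract.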