Learning a Deep Single Image Contrast Enhancer from Multi-Exposure Images

Detailed bibliography
Published in: IEEE Transactions on Image Processing, Vol. 27, No. 4, pp. 2049–2062
Main authors: Cai, Jianrui; Gu, Shuhang; Zhang, Lei
Medium: Journal Article
Language: English
Published: United States, IEEE, 01.04.2018
ISSN: 1057-7149, 1941-0042
Abstract: Due to poor lighting conditions and the limited dynamic range of digital imaging devices, recorded images are often under-/over-exposed and have low contrast. Most previous single image contrast enhancement (SICE) methods adjust the tone curve to correct the contrast of an input image. Those methods, however, often fail to reveal image details because of the limited information in a single image. On the other hand, the SICE task can be better accomplished if we can learn extra information from appropriately collected training data. In this paper, we propose to use a convolutional neural network (CNN) to train a SICE enhancer. One key issue is how to construct a training data set of low-contrast and high-contrast image pairs for end-to-end CNN learning. To this end, we build a large-scale multi-exposure image data set, which contains 589 elaborately selected high-resolution multi-exposure sequences with 4,413 images. Thirteen representative multi-exposure image fusion and stack-based high dynamic range imaging algorithms are employed to generate contrast-enhanced images for each sequence, and subjective experiments are conducted to screen the best-quality one as the reference image of each scene. With the constructed data set, a CNN can be easily trained as the SICE enhancer to improve the contrast of an under-/over-exposed image. Experimental results demonstrate the advantages of our method over existing SICE methods by a significant margin.
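To illustrate the end-to-end pairing idea described in the abstract, the sketch below trains a deliberately tiny per-pixel affine "enhancer" on one synthetic low-contrast/reference pair. The affine map, the toy images, and the contrast-reduction model are all illustrative assumptions, not the paper's actual CNN architecture or data set; the point is only the supervised setup: minimize a pixel-wise loss between the enhanced input and a high-quality reference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for one training pair: a well-exposed reference frame
# (in the paper, chosen from 13 MEF/HDR results by subjective screening)
# and a simulated low-contrast input derived from it (assumed model).
ref = rng.uniform(0.0, 1.0, size=(32, 32))  # reference target, values in [0, 1]
low = 0.4 * ref + 0.1                       # compressed dynamic range (toy)

# A minimal "enhancer": per-pixel affine map y = w*x + b, trained
# end-to-end with an MSE loss, standing in for the CNN enhancer.
w, b, lr = 1.0, 0.0, 0.5

def mse(w, b):
    return np.mean((w * low + b - ref) ** 2)

loss_before = mse(w, b)
for _ in range(200):
    err = w * low + b - ref
    w -= lr * np.mean(2 * err * low)  # gradient of MSE w.r.t. w
    b -= lr * np.mean(2 * err)        # gradient of MSE w.r.t. b
loss_after = mse(w, b)

# The map learns to stretch the compressed range back toward the reference.
print(f"loss before: {loss_before:.4f}, after: {loss_after:.6f}")
```

The real method replaces the affine map with a deep CNN and the toy pair with 589 multi-exposure sequences, but the objective has the same shape: a network output compared against a screened reference image.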
MEDLINE/PubMed: https://www.ncbi.nlm.nih.gov/pubmed/29994747
CODEN: IIPRE4
crossref_primary_10_3390_su15021029
crossref_primary_10_3390_jimaging10050112
crossref_primary_10_1016_j_patrec_2024_02_011
crossref_primary_10_1109_ACCESS_2019_2957775
crossref_primary_10_1109_LSP_2021_3138351
crossref_primary_10_3390_electronics12081887
crossref_primary_10_3390_app13010380
crossref_primary_10_1109_TIP_2019_2913536
crossref_primary_10_1109_TGRS_2021_3124252
crossref_primary_10_3390_land12111977
crossref_primary_10_1109_TCI_2023_3240087
crossref_primary_10_1109_TCSVT_2022_3163649
crossref_primary_10_1109_TIP_2024_3378176
crossref_primary_10_1007_s11128_020_02952_7
crossref_primary_10_1016_j_engappai_2023_107003
crossref_primary_10_1109_TMM_2023_3278385
crossref_primary_10_1109_JSEN_2025_3543768
crossref_primary_10_1007_s11263_024_02256_8
crossref_primary_10_1049_iet_ipr_2019_0118
crossref_primary_10_1016_j_displa_2025_103174
crossref_primary_10_1016_j_neucom_2020_12_057
crossref_primary_10_1109_JSEN_2024_3481416
crossref_primary_10_1016_j_jestch_2018_11_006
crossref_primary_10_1109_TCSVT_2023_3290351
crossref_primary_10_1109_TMM_2021_3089324
crossref_primary_10_1109_TIM_2022_3176881
crossref_primary_10_1109_TMM_2022_3175634
crossref_primary_10_1109_TPAMI_2021_3078906
crossref_primary_10_1117_1_JEI_32_2_023005
crossref_primary_10_1007_s11760_025_04292_4
crossref_primary_10_1016_j_ijleo_2022_169132
crossref_primary_10_1007_s11042_023_15233_z
crossref_primary_10_1016_j_jvcir_2022_103585
crossref_primary_10_1016_j_neucom_2024_128011
crossref_primary_10_1016_j_inffus_2024_102414
crossref_primary_10_1007_s00530_020_00691_4
crossref_primary_10_3390_electronics11010032
Cites_doi 10.1145/882262.882270
10.1109/ICME.2017.8019529
10.1145/3072959.3073592
10.1364/JOSA.61.000001
10.1109/TIP.2015.2436340
10.1023/A:1026501619075
10.1109/CVPR.2016.182
10.1109/83.597272
10.1109/TIP.2011.2150235
10.1109/CVPR.2010.5539850
10.1007/978-3-642-33765-9_55
10.1109/TIP.2017.2671921
10.1109/ICCV.2003.1238624
10.1016/j.cag.2013.10.001
10.1109/TIP.2014.2371234
10.1145/3072959.3073609
10.1145/1360612.1360666
10.1109/TIP.2011.2109730
10.1109/CVPR.2010.5540170
10.1145/2366145.2366222
10.1201/b11373
10.1109/TCYB.2013.2290435
10.1109/ICIF.2006.301574
10.1109/TIP.2011.2170079
10.1145/3130800.3130816
10.1109/CVPR.2017.737
10.1109/CVPR.2013.154
10.1109/TIP.2015.2442920
10.1109/TIP.2017.2651366
10.1109/CVPR.2016.304
10.1109/TIP.2016.2639450
10.1109/TIP.2009.2021548
10.1109/TPAMI.2014.2361338
10.1109/TIP.2012.2207396
10.1109/CVPR.2016.90
10.1109/TIP.2014.2349432
10.1111/j.1467-8659.2008.01171.x
10.1145/566654.566574
10.1109/ICCV.2015.123
10.1145/1401132.1401174
10.1109/TIP.2011.2157513
10.1109/TIP.2003.819861
10.1145/3130800.3130834
10.1109/TIP.2012.2226047
10.1109/83.557356
10.1109/TIP.2017.2662206
10.1109/TIP.2013.2244222
10.1109/TIP.2013.2261309
ContentType Journal Article
DBID 97E
RIA
RIE
AAYXX
CITATION
NPM
7X8
DOI 10.1109/TIP.2018.2794218
DatabaseName IEEE Xplore (IEEE)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
PubMed
MEDLINE - Academic
DatabaseTitle CrossRef
PubMed
MEDLINE - Academic
DatabaseTitleList
MEDLINE - Academic
PubMed
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: RIE
  name: IEEE Xplore
  url: https://ieeexplore.ieee.org/
  sourceTypes: Publisher
– sequence: 3
  dbid: 7X8
  name: MEDLINE - Academic
  url: https://search.proquest.com/medline
  sourceTypes: Aggregation Database
DeliveryMethod fulltext_linktorsrc
Discipline Applied Sciences
Engineering
EISSN 1941-0042
EndPage 2062
ExternalDocumentID 29994747
10_1109_TIP_2018_2794218
8259342
Genre orig-research
Journal Article
GrantInformation_xml – fundername: Hong Kong RGC GRF
  grantid: PolyU 5313/13E
GroupedDBID ---
-~X
.DC
0R~
29I
4.4
53G
5GY
5VS
6IK
97E
AAJGR
AARMG
AASAJ
AAWTH
ABAZT
ABFSI
ABQJQ
ABVLG
ACGFO
ACGFS
ACIWK
AENEX
AETIX
AGQYO
AGSQL
AHBIQ
AI.
AIBXA
AKJIK
AKQYR
ALLEH
ALMA_UNASSIGNED_HOLDINGS
ASUFR
ATWAV
BEFXN
BFFAM
BGNUA
BKEBE
BPEOZ
CS3
DU5
E.L
EBS
EJD
F5P
HZ~
H~9
ICLAB
IFIPE
IFJZH
IPLJI
JAVBF
LAI
M43
MS~
O9-
OCL
P2P
RIA
RIE
RNS
TAE
TN5
VH1
AAYXX
CITATION
NPM
RIG
Z5M
7X8
ID FETCH-LOGICAL-c385t-7d60e86a832c3df7ea8ac87d6c8adefc4105fe0275447d386c5f50f0d4ac6c4b3
IEDL.DBID RIE
ISICitedReferencesCount 864
ISICitedReferencesURI http://www.webofscience.com/api/gateway?GWVersion=2&SrcApp=Summon&SrcAuth=ProQuest&DestLinkType=CitingArticles&DestApp=WOS_CPL&KeyUT=000429464300002&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
ISSN 1057-7149
1941-0042
IngestDate Thu Oct 02 09:56:26 EDT 2025
Wed Feb 19 02:09:29 EST 2025
Sat Nov 29 03:21:07 EST 2025
Tue Nov 18 22:28:51 EST 2025
Wed Aug 27 02:52:25 EDT 2025
IsPeerReviewed true
IsScholarly true
Issue 4
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c385t-7d60e86a832c3df7ea8ac87d6c8adefc4105fe0275447d386c5f50f0d4ac6c4b3
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 23
PMID 29994747
PQID 2068342500
PQPubID 23479
PageCount 14
ParticipantIDs proquest_miscellaneous_2068342500
pubmed_primary_29994747
crossref_primary_10_1109_TIP_2018_2794218
ieee_primary_8259342
crossref_citationtrail_10_1109_TIP_2018_2794218
PublicationCentury 2000
PublicationDate 2018-04-01
PublicationDateYYYYMMDD 2018-04-01
PublicationDate_xml – month: 04
  year: 2018
  text: 2018-04-01
  day: 01
PublicationDecade 2010
PublicationPlace United States
PublicationPlace_xml – name: United States
PublicationTitle IEEE transactions on image processing
PublicationTitleAbbrev TIP
PublicationTitleAlternate IEEE Trans Image Process
PublicationYear 2018
Publisher IEEE
Publisher_xml – name: IEEE
References ref13
ref12
ref15
ref58
ref14
ref53
ref52
xie (ref34) 2012
ref55
ref11
ref54
ref10
ref17
ref16
ref18
ref51
ref50
ref46
jain (ref24) 2009
ref48
ref47
ref42
ref41
ref44
ref43
(ref45) 2015
ref49
ref8
ref7
ref9
ref4
ref3
ref6
ref5
li (ref59) 2014
ref40
ref35
ref36
ref31
ref33
ioffe (ref56) 2015
ref32
ref2
ref1
ref38
reinhard (ref30) 2010
xu (ref22) 2015
ronneberger (ref57) 2015
nemoto (ref37) 2015
ref26
raman (ref39) 2009
ref25
li (ref27) 2015; 24
ref20
ref21
ref28
ref29
bychkovsky (ref19) 2011
ref60
dong (ref23) 2014
References_xml – ident: ref38
  doi: 10.1145/882262.882270
– ident: ref43
  doi: 10.1109/ICME.2017.8019529
– ident: ref29
  doi: 10.1145/3072959.3073592
– ident: ref49
  doi: 10.1364/JOSA.61.000001
– start-page: 341
  year: 2012
  ident: ref34
  article-title: Image denoising and inpainting with deep neural networks
  publication-title: Proc Adv Neural Inf Process Syst
– ident: ref17
  doi: 10.1109/TIP.2015.2436340
– ident: ref18
  doi: 10.1023/A:1026501619075
– ident: ref21
  doi: 10.1109/CVPR.2016.182
– ident: ref7
  doi: 10.1109/83.597272
– ident: ref40
  doi: 10.1109/TIP.2011.2150235
– start-page: 97
  year: 2011
  ident: ref19
  article-title: Learning photographic global tonal adjustment with a database of input/output image pairs
  publication-title: Proc IEEE Conf Comput Vis Pattern Recognit (CVPR)
– ident: ref20
  doi: 10.1109/CVPR.2010.5539850
– ident: ref26
  doi: 10.1007/978-3-642-33765-9_55
– ident: ref2
  doi: 10.1109/TIP.2017.2671921
– start-page: 174
  year: 2014
  ident: ref59
  article-title: A contrast enhancement framework with JPEG artifacts suppression
  publication-title: Proc Eur Conf Comput Vis
– ident: ref11
  doi: 10.1109/ICCV.2003.1238624
– ident: ref44
  doi: 10.1016/j.cag.2013.10.001
– volume: 24
  start-page: 120
  year: 2015
  ident: ref27
  article-title: Weighted guided image filtering
  publication-title: IEEE Trans Image Process
  doi: 10.1109/TIP.2014.2371234
– ident: ref33
  doi: 10.1145/3072959.3073609
– year: 2015
  ident: ref45
  publication-title: Commercially-Available HDR Processing Software
– ident: ref54
  doi: 10.1145/1360612.1360666
– ident: ref60
  doi: 10.1109/TIP.2011.2109730
– year: 2015
  ident: ref37
  article-title: Visual attention in LDR and HDR images
  publication-title: Proc 9th Int Workshop Video Process Quality Metrics Consumer Electron (VPQM)
– year: 2010
  ident: ref30
  publication-title: High Dynamic Range Imaging Acquisition Display and Image-Based Lighting
– ident: ref25
  doi: 10.1109/CVPR.2010.5540170
– ident: ref14
  doi: 10.1145/2366145.2366222
– start-page: 1669
  year: 2015
  ident: ref22
  article-title: Deep edge-aware filters
  publication-title: Proc Int Conf Mach Learn (ICML)
– start-page: 234
  year: 2015
  ident: ref57
  article-title: U-net: Convolutional networks for biomedical image segmentation
  publication-title: Proc Int Conf Med Image Comput Comput -Assist Intervent
– ident: ref46
  doi: 10.1201/b11373
– ident: ref42
  doi: 10.1109/TCYB.2013.2290435
– start-page: 1
  year: 2009
  ident: ref39
  article-title: Bilateral filter based compositing for variable exposure photography
  publication-title: In Eurographics (Short papers)
– start-page: 184
  year: 2014
  ident: ref23
  article-title: Learning a deep convolutional network for image super-resolution
  publication-title: Proc Eur Conf Comput Vis
– ident: ref47
  doi: 10.1109/ICIF.2006.301574
– ident: ref41
  doi: 10.1109/TIP.2011.2170079
– ident: ref28
  doi: 10.1145/3130800.3130816
– ident: ref36
  doi: 10.1109/CVPR.2017.737
– ident: ref32
  doi: 10.1109/CVPR.2013.154
– ident: ref16
  doi: 10.1109/TIP.2015.2442920
– ident: ref51
  doi: 10.1109/TIP.2017.2651366
– ident: ref3
  doi: 10.1109/CVPR.2016.304
– ident: ref1
  doi: 10.1109/TIP.2016.2639450
– ident: ref4
  doi: 10.1109/TIP.2009.2021548
– ident: ref15
  doi: 10.1109/TPAMI.2014.2361338
– ident: ref52
  doi: 10.1109/TIP.2012.2207396
– ident: ref58
  doi: 10.1109/CVPR.2016.90
– ident: ref53
  doi: 10.1109/TIP.2014.2349432
– ident: ref13
  doi: 10.1111/j.1467-8659.2008.01171.x
– ident: ref50
  doi: 10.1145/566654.566574
– ident: ref55
  doi: 10.1109/ICCV.2015.123
– start-page: 769
  year: 2009
  ident: ref24
  article-title: Natural image denoising with convolutional networks
  publication-title: Proc Adv Neural Inf Process Syst
– ident: ref10
  doi: 10.1145/1401132.1401174
– start-page: 448
  year: 2015
  ident: ref56
  article-title: Batch normalization: Accelerating deep network training by reducing internal covariate shift
  publication-title: Proc Int Conf Mach Learn (ICML)
– ident: ref5
  doi: 10.1109/TIP.2011.2157513
– ident: ref48
  doi: 10.1109/TIP.2003.819861
– ident: ref31
  doi: 10.1145/3130800.3130834
– ident: ref6
  doi: 10.1109/TIP.2012.2226047
– ident: ref9
  doi: 10.1109/83.557356
– ident: ref35
  doi: 10.1109/TIP.2017.2662206
– ident: ref12
  doi: 10.1109/TIP.2013.2244222
– ident: ref8
  doi: 10.1109/TIP.2013.2261309
SSID ssj0014516
Score 2.7082233
Snippet Due to the poor lighting condition and limited dynamic range of digital imaging devices, the recorded images are often under-/over-exposed and with low...
SourceID proquest
pubmed
crossref
ieee
SourceType Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 2049
SubjectTerms convolutional neural network
Dynamic range
Heuristic algorithms
Image sequences
Imaging
Lighting
multi-exposure image fusion
Single image contrast enhancement
Training
Training data
Title Learning a Deep Single Image Contrast Enhancer from Multi-Exposure Images
URI https://ieeexplore.ieee.org/document/8259342
https://www.ncbi.nlm.nih.gov/pubmed/29994747
https://www.proquest.com/docview/2068342500
Volume 27
WOSCitedRecordID wos000429464300002&url=https%3A%2F%2Fcvtisr.summon.serialssolutions.com%2F%23%21%2Fsearch%3Fho%3Df%26include.ft.matches%3Dt%26l%3Dnull%26q%3D
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
journalDatabaseRights – providerCode: PRVIEE
  databaseName: IEEE Xplore
  customDbUrl:
  eissn: 1941-0042
  dateEnd: 99991231
  omitProxy: false
  ssIdentifier: ssj0014516
  issn: 1057-7149
  databaseCode: RIE
  dateStart: 19920101
  isFulltext: true
  titleUrlDefault: https://ieeexplore.ieee.org/
  providerName: IEEE
link http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlR3LTsMwzALEAQ68H-OlIHFBoiysaZMcEQyxC0ICpN2qLHUBCbpp3RCfj912FQdAoqeoctootmM7fgGcSIXGklgPjEcTqNi4wEb-gjOW085ADXQnq5pN6Ls70-_b-zk4a3JhELEMPsNzHpa-_HTop3xV1iZrxoaKDtx5reMqV6vxGHDD2dKzGelAk9o_c0lK237s3XMMlznvEPF1uL3HNxFU9lT5Xb0sxczN6v8WuAYrtTopLiv8r8Mc5huwWquWombcYgOWv9Ud3IReXVX1WThxjTgSDzR-Q9F7p9NFcMGqsSsmopu_ME2MBeegiDJVN-h-joZ8qVjBFlvwdNN9vLoN6p4KgQ9NNAl0Gks0sSNG9mGaaXTGeUNvvXEpZp6jPjNkX6ZSOg1N7KMskplMlfOxV4NwGxbyYY67IJC4newZekjEkSJgUuelRRk5O7Bemha0Z9uc-LrgOPe9eEtKw0PahBCTMGKSGjEtOG1mjKpiG3_AbvL-N3D11rfgeIbJhBiFvR8ux-G0oMmxIYBIyhbsVChuJpNMtooMq72fP7oPS_zrKmDnABYm4ykewqL_mLwW4yOixr45KqnxC5GN2O8
linkProvider IEEE
linkToHtml http://cvtisr.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlR1dS-QwcBBPUB_8_ti7UyP4Ilg3tmmbPB7niou6CK7gW8mmUxW0u2x35X7-zbTd4oMK9imUSQiZmcxM5gvgSCrUhsS6px1qT0XaeiZ0Z5yxnPoDNYj9rGo2Efd6-uHB3M7BSZMLg4hl8Bme8rD05adDN-WnsjZZMyZQdOH-CJXyZZWt1fgMuOVs6dsMYy8mxX_mlJSm3e_echSXPvWJ_Hxu8PFOCJVdVT5XMEtBc7H6vS2uwUqtUIo_FQWswxzmG7BaK5eiZt1iA5bfVR7chG5dV_VRWHGOOBJ3NH5B0X2l-0VwyaqxLSaikz8xVYwFZ6GIMlnX6_wbDflZsYIttuD-otP_e-nVXRU8F-hw4sVpJFFHlljZBWkWo9XWafrrtE0xcxz3mSF7M5WK00BHLsxCmclUWRc5NQi2YT4f5rgLAonfyaKhj4QcqQI6tU4alKE1A-OkbkF7dsyJq0uOc-eLl6Q0PaRJCDEJIyapEdOC42bGqCq38QXsJp9_A1cffQsOZ5hMiFXY_2FzHE4LmhxpAgilbMFOheJmMkllo8i0-vnxogeweNm_uU6uu72rX7DE26jCd37D_GQ8xT1YcG-T52K8X9Lkf-Tg204
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Learning+a+Deep+Single+Image+Contrast+Enhancer+from+Multi-Exposure+Images&rft.jtitle=IEEE+transactions+on+image+processing&rft.au=Jianrui+Cai&rft.au=Shuhang+Gu&rft.au=Lei+Zhang&rft.date=2018-04-01&rft.pub=IEEE&rft.issn=1057-7149&rft.volume=27&rft.issue=4&rft.spage=2049&rft.epage=2062&rft_id=info:doi/10.1109%2FTIP.2018.2794218&rft.externalDocID=8259342
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=1057-7149&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=1057-7149&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=1057-7149&client=summon