Residual Dense Network for Image Restoration
Saved in:
| Published in: | IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 43, No. 7, pp. 2480-2495 |
|---|---|
| Main Authors: | Zhang, Yulun; Tian, Yapeng; Kong, Yu; Zhong, Bineng; Fu, Yun |
| Format: | Journal Article |
| Language: | English |
| Published: | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.07.2021 |
| Subjects: | Image restoration; image super-resolution; image denoising; compression artifact reduction; image deblurring; hierarchical features; residual dense network |
| ISSN: | 0162-8828, 1939-3539, 2160-9292 |
| Online Access: | Full Text |
| Abstract | Recently, deep convolutional neural networks (CNNs) have achieved great success in image restoration (IR) and provide hierarchical features at the same time. However, most deep CNN-based IR models do not make full use of the hierarchical features from the original low-quality images, thereby resulting in relatively low performance. In this work, we propose a novel and efficient residual dense network (RDN) to address this problem in IR by making a better tradeoff between efficiency and effectiveness in exploiting the hierarchical features from all the convolutional layers. Specifically, we propose the residual dense block (RDB) to extract abundant local features via densely connected convolutional layers. The RDB further allows direct connections from the state of the preceding RDB to all the layers of the current RDB, leading to a contiguous memory mechanism. To adaptively learn more effective features from preceding and current local features and to stabilize the training of the wider network, we propose local feature fusion in the RDB. After fully obtaining dense local features, we use global feature fusion to jointly and adaptively learn global hierarchical features in a holistic way. We demonstrate the effectiveness of RDN on several representative IR applications: single image super-resolution, Gaussian image denoising, image compression artifact reduction, and image deblurring. Experiments on benchmark and real-world datasets show that our RDN achieves favorable performance against state-of-the-art methods on each IR task, both quantitatively and visually. |
|---|---|
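The residual dense block with local feature fusion and local residual learning described in the abstract can be summarized in a short sketch. The following is a minimal illustration, assuming a PyTorch-style implementation; the class name, layer count, and channel widths are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of a residual dense block (RDB) with local feature fusion,
# assuming PyTorch; layer count and channel widths are illustrative only.
import torch
import torch.nn as nn

class ResidualDenseBlock(nn.Module):
    def __init__(self, channels=64, growth=32, num_layers=6):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            # Dense connections: each conv sees the block input plus all
            # preceding layer outputs, concatenated along the channel axis.
            self.layers.append(nn.Sequential(
                nn.Conv2d(channels + i * growth, growth, 3, padding=1),
                nn.ReLU(inplace=True),
            ))
        # Local feature fusion: a 1x1 conv compresses the concatenated
        # features back to the block's channel width.
        self.fusion = nn.Conv2d(channels + num_layers * growth, channels, 1)

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        # Local residual learning: fused features are added to the block input.
        return x + self.fusion(torch.cat(features, dim=1))

# Usage: a full RDN would stack several RDBs and fuse their outputs with
# global feature fusion; only a single block is shown here.
block = ResidualDenseBlock()
out = block(torch.randn(1, 64, 32, 32))  # -> torch.Size([1, 64, 32, 32])
```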
| Author | Fu, Yun; Zhang, Yulun; Tian, Yapeng; Kong, Yu; Zhong, Bineng |
| Author_xml | – sequence: 1 givenname: Yulun orcidid: 0000-0002-2288-5079 surname: Zhang fullname: Zhang, Yulun email: yulun100@gmail.com organization: Department of Electrical and Computer Engineering, Northeastern University, Boston, MA, USA – sequence: 2 givenname: Yapeng surname: Tian fullname: Tian, Yapeng email: yapengtian@rochester.edu organization: Department of Computer Science, University of Rochester, Rochester, NY, USA – sequence: 3 givenname: Yu orcidid: 0000-0001-6271-4082 surname: Kong fullname: Kong, Yu email: yu.kong@rit.edu organization: B. Thomas Golisano College of Computing and Information Sciences, Rochester Institute of Technology, Rochester, NY, USA – sequence: 4 givenname: Bineng orcidid: 0000-0003-3423-1539 surname: Zhong fullname: Zhong, Bineng email: bnzhong@hqu.edu.cn organization: School of Computer Science and Technology, Huaqiao University, Xiamen, China – sequence: 5 givenname: Yun orcidid: 0000-0002-5098-2853 surname: Fu fullname: Fu, Yun email: yunfu@ece.neu.edu organization: Department of Electrical and Computer Engineering and Khoury College of Computer Science, Northeastern University, Boston, MA, USA |
| BackLink | https://www.ncbi.nlm.nih.gov/pubmed/31985406 (View this record in MEDLINE/PubMed) |
| CODEN | ITPIDJ |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2021 |
| DOI | 10.1109/TPAMI.2020.2968521 |
| Discipline | Engineering Computer Science |
| EISSN | 2160-9292 1939-3539 |
| EndPage | 2495 |
| ExternalDocumentID | 31985406 10_1109_TPAMI_2020_2968521 8964437 |
| Genre | orig-research Journal Article |
| GrantInformation_xml | – fundername: Army Research Office grantid: W911NF-17-1-0367 funderid: 10.13039/100000183 |
| ISICitedReferencesCount | 588 |
| ISSN | 0162-8828 1939-3539 |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 7 |
| Language | English |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| ORCID | 0000-0002-2288-5079 0000-0003-3423-1539 0000-0002-5098-2853 0000-0001-6271-4082 |
| PMID | 31985406 |
| PQID | 2539352574 |
| PQPubID | 85458 |
| PageCount | 16 |
| PublicationCentury | 2000 |
| PublicationDate | 2021-07-01 |
| PublicationDecade | 2020 |
| PublicationPlace | United States |
| PublicationPlace_xml | – name: United States – name: New York |
| PublicationTitle | IEEE transactions on pattern analysis and machine intelligence |
| PublicationTitleAbbrev | TPAMI |
| PublicationTitleAlternate | IEEE Trans Pattern Anal Mach Intell |
| PublicationYear | 2021 |
| Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 2480 |
| SubjectTerms | Artificial neural networks; compression artifact reduction; Feature extraction; hierarchical features; Image coding; Image compression; image deblurring; Image denoising; Image quality; Image resolution; Image restoration; image super-resolution; Noise reduction; Residual dense network; Task analysis; Training |
| Title | Residual Dense Network for Image Restoration |
| URI | https://ieeexplore.ieee.org/document/8964437 https://www.ncbi.nlm.nih.gov/pubmed/31985406 https://www.proquest.com/docview/2539352574 https://www.proquest.com/docview/2346283604 |
| Volume | 43 |