A Sparse-View CT Reconstruction Method Based on Combination of DenseNet and Deconvolution

Published in: IEEE Transactions on Medical Imaging, Vol. 37, No. 6, pp. 1407–1417
Main authors: Zhang, Zhicheng; Liang, Xiaokun; Dong, Xu; Xie, Yaoqin; Cao, Guohua
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 1 June 2018
ISSN: 0278-0062; EISSN: 1558-254X
Online access: Get full text
Abstract: Sparse-view computed tomography (CT) holds great promise for speeding up data acquisition and reducing radiation dose in CT scans. Recent advances in reconstruction algorithms for sparse-view CT, such as iterative reconstruction algorithms, obtain high-quality images but require substantial computing power. Recently, deep learning (DL) has been widely used in various applications and has achieved many remarkable outcomes. In this paper, we propose a new DL-based method for sparse-view CT reconstruction. The method consists of two steps. First, filtered backprojection (FBP) is used to reconstruct a CT image from the sparsely sampled sinogram. Then, the FBP result is fed to a DL neural network, a DenseNet- and deconvolution-based network (DD-Net). The DD-Net combines the advantages of DenseNet and deconvolution and applies shortcut connections to concatenate DenseNet and deconvolution features, accelerating training; these operations greatly increase the depth of the network while enhancing its expressive ability. After training, the proposed DD-Net achieved competitive performance relative to state-of-the-art methods in terms of streaking-artifact removal and structure preservation. Compared with the other state-of-the-art reconstruction methods, DD-Net increases structural similarity by up to 18% and reduces root mean square error by up to 42%. These results indicate that DD-Net has great potential for sparse-view CT image reconstruction.
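The dense connectivity and deconvolution-based upsampling the abstract describes can be illustrated with a toy NumPy sketch. This is not the paper's actual architecture: the convolution is replaced by a hypothetical random 1×1 channel mix, and the "deconvolution" by nearest-neighbour upsampling. It only shows how channel-wise concatenation grows the feature stack each layer sees, and how upsampling restores spatial size:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_layer(x, out_ch):
    """Stand-in for a conv + ReLU layer: a random 1x1 channel mix
    (hypothetical toy layer, just to track channel counts)."""
    w = rng.standard_normal((x.shape[0], out_ch)) * 0.1
    y = np.einsum('chw,co->ohw', x, w)   # mix channels, keep spatial dims
    return np.maximum(y, 0.0)            # ReLU

def dense_block(x, growth_rate=8, n_layers=3):
    """DenseNet-style block: each layer's input is the concatenation of
    the block input and every previous layer's output."""
    feats = [x]
    for _ in range(n_layers):
        inp = np.concatenate(feats, axis=0)          # channel-wise concat
        feats.append(conv_layer(inp, growth_rate))
    return np.concatenate(feats, axis=0)

def deconv_upsample(x):
    """Stand-in for a stride-2 transposed convolution ('deconvolution'):
    nearest-neighbour upsampling that doubles each spatial dimension."""
    return x.repeat(2, axis=1).repeat(2, axis=2)

fbp_features = rng.standard_normal((4, 16, 16))  # 4 channels of 16x16 "FBP" maps
dense_out = dense_block(fbp_features)            # 4 + 3*8 = 28 channels
up = deconv_upsample(dense_out)                  # 28 channels, 32x32
print(dense_out.shape, up.shape)                 # (28, 16, 16) (28, 32, 32)
```

The point is that depth comes cheaply: each layer adds only `growth_rate` channels, yet sees all earlier features through the concatenation shortcut.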
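The reported gains in structural similarity and root mean square error refer to standard image-quality metrics. A minimal NumPy sketch of both (using a single global SSIM window rather than the usual local-window average, and a hypothetical toy "reconstruction" in place of a network output) is:

```python
import numpy as np

def rmse(ref, rec):
    """Root mean square error between a reference and a reconstruction."""
    return float(np.sqrt(np.mean((ref - rec) ** 2)))

def global_ssim(ref, rec, data_range=1.0):
    """Single-window (global) SSIM with the standard constants C1, C2.
    The full SSIM index averages this quantity over local windows."""
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = ref.mean(), rec.mean()
    vx, vy = ref.var(), rec.var()
    cov = ((ref - mx) * (rec - my)).mean()
    return float(((2 * mx * my + c1) * (2 * cov + c2)) /
                 ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))

rng = np.random.default_rng(0)
phantom = rng.random((64, 64))                                   # toy ground truth
noisy = np.clip(phantom + 0.1 * rng.standard_normal((64, 64)), 0, 1)  # "FBP" input
denoised = 0.5 * (noisy + phantom)   # pretend network output: closer to the phantom

print(f"RMSE: noisy={rmse(phantom, noisy):.3f}, denoised={rmse(phantom, denoised):.3f}")
print(f"SSIM: noisy={global_ssim(phantom, noisy):.3f}, denoised={global_ssim(phantom, denoised):.3f}")
```

A better reconstruction moves RMSE toward 0 and SSIM toward 1, which is the direction of the improvements the abstract quantifies.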
Authors (with ORCID):
– Zhang, Zhicheng (0000-0002-5333-1394)
– Liang, Xiaokun (0000-0002-1207-5726)
– Dong, Xu (0000-0001-9669-0357)
– Xie, Yaoqin
– Cao, Guohua (0000-0003-2107-7587)
CODEN: ITMID4
Copyright: © 2018 The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
DOI: 10.1109/TMI.2018.2823338
Genre: Original research; Research Support, U.S. Gov't, Non-P.H.S.; Research Support, Non-U.S. Gov't
Funding:
– Union of Production, Study and Research Project of Guangdong Province (2015B090901039)
– Technological Breakthrough Project of Shenzhen City (JSGG20160229203812944)
– UCAS Joint PhD Training Program
– U.S. National Science Foundation CAREER Award to Guohua Cao (CBET 1351936)
– National Key Research and Development Program of China (2016YFC0105102)
– Natural Science Foundation of Guangdong Province (2014A030312006)
PMID: 29870369
References ref13
ref56
ref12
ref59
ref15
ref14
ref53
ref52
ref10
hong (ref49) 2015
huang (ref39) 2016
xu (ref40) 2014
nair (ref43) 2010
ref17
ref16
(ref58) 1999
cheng (ref33) 2017
radford (ref62) 2015
ref51
ref50
brenner (ref5) 2007; 357
xu (ref8) 2012; 31
lee (ref18) 2017; 10133
ref48
ref47
ref41
sidky (ref11) 2006; 14
ref7
ref9
ref4
ref3
ref6
(ref57) 2014
ronneberger (ref36) 2015
ref35
ref37
krizhevsky (ref19) 2012
wang (ref63) 2017
ref32
ref2
ref1
kingma (ref54) 2014
ref38
he (ref46) 2016
srivastava (ref44) 2015
ref24
ref23
ref26
ref25
ref20
ioffe (ref42) 2015
ref22
ref21
grant (ref28) 2010
larsson (ref45) 2016
ref27
ref29
abadi (ref55) 2016
ref60
analysis (ref34) 2016
ref61
lecun (ref30) 2015; 521
würfl (ref31) 2016
References_xml – year: 1999
  ident: ref58
  publication-title: National Biomedical Imaging Archive (NBIA)
– year: 2014
  ident: ref54
  publication-title: Adam A method for stochastic optimization
– ident: ref48
  doi: 10.1109/CVPR.2016.91
– year: 2017
  ident: ref63
  publication-title: Perceptual Adversarial Networks for Image-to-Image Transformation
– ident: ref27
  doi: 10.1109/TMI.2014.2350962
– ident: ref14
  doi: 10.1088/0031-9155/55/22/001
– start-page: 807
  year: 2010
  ident: ref43
  article-title: Rectified linear units improve restricted Boltzmann machines
  publication-title: Proc 27th Int Conf Mach Learn
– ident: ref24
  doi: 10.1088/0031-9155/61/18/6878
– ident: ref26
  doi: 10.1109/TMI.2015.2508780
– start-page: 1790
  year: 2014
  ident: ref40
  article-title: Deep convolutional neural network for image deconvolution
  publication-title: Proc Adv Neural Inf Process Syst
– year: 2014
  ident: ref57
  publication-title: The Cancer Imaging Archive (TCIA)
– ident: ref50
  doi: 10.1109/TPAMI.2015.2481418
– ident: ref23
  doi: 10.1109/TMI.2014.2319055
– ident: ref53
  doi: 10.1109/TCI.2016.2644865
– ident: ref12
  doi: 10.1118/1.2836423
– start-page: 448
  year: 2015
  ident: ref42
  article-title: Batch normalization: Accelerating deep network training by reducing internal covariate shift
  publication-title: Proc Int Conf Mach Learn
– ident: ref2
  doi: 10.1259/0007-1285-46-552-1016
– ident: ref17
  doi: 10.1109/TSP.2006.881199
– year: 2015
  ident: ref62
  publication-title: Unsupervised Representation learning with deep convolutional generative adversarial networks CoRR
– ident: ref37
  doi: 10.1109/CVPR.2016.90
– ident: ref60
  doi: 10.1016/j.compmedimag.2008.05.005
– volume: 14
  start-page: 119
  year: 2006
  ident: ref11
  article-title: Accurate image reconstruction from few-views and limited-angle data in divergent-beam CT
  publication-title: J X-Ray Sci Technol
– ident: ref32
  doi: 10.1016/S0168-9002(99)01453-9
– ident: ref16
  doi: 10.1016/j.ijleo.2014.01.003
– volume: 31
  start-page: 1682
  year: 2012
  ident: ref8
  article-title: Low-dose X-ray CT reconstruction via dictionary learning
  publication-title: IEEE Trans Med Imaging
  doi: 10.1109/TMI.2012.2195669
– start-page: 432
  year: 2016
  ident: ref31
  article-title: Deep learning computed tomography
  publication-title: Proc Int Conf Med Image Comput Comput Assist Intervent
– volume: 521
  start-page: 436
  year: 2015
  ident: ref30
  article-title: Deep learning
  publication-title: Nature
  doi: 10.1038/nature14539
– year: 2010
  ident: ref28
  publication-title: Iterative reconstruction in image space (IRIS)
– ident: ref9
  doi: 10.1109/TMI.2014.2336860
– start-page: 234
  year: 2015
  ident: ref36
  article-title: U-Net: Convolutional networks for biomedical image segmentation
  publication-title: Proc Int Conf Med Image Comput Comput Assist Intervent
– ident: ref7
  doi: 10.1148/radiol.2231012100
– year: 2016
  ident: ref45
  publication-title: Fractalnet Ultra-deep neural networks without residuals
– year: 2016
  ident: ref34
  publication-title: Deep residual learning for compressed sensing ct reconstruction via persistent homology analysis
– ident: ref3
  doi: 10.1118/1.2836950
– start-page: 1495
  year: 2015
  ident: ref49
  article-title: Decoupled deep neural network for semi-supervised semantic segmentation
  publication-title: Proc NIPS
– ident: ref1
  doi: 10.1063/1.1713127
– ident: ref22
  doi: 10.1109/TIT.2005.862083
– start-page: 2377
  year: 2015
  ident: ref44
  article-title: Training very deep networks
  publication-title: Proc NIPS
– ident: ref38
  doi: 10.1109/ACCESS.2016.2624938
– start-page: 630
  year: 2016
  ident: ref46
  article-title: Identity mappings in deep residual networks
  publication-title: Proc Eur Conf Comput Vis
– start-page: 715
  year: 2017
  ident: ref33
  article-title: Accelerated iterative image reconstruction using a deep learning based leapfrogging strategy
  publication-title: Proc Int Conf Fully Three-Dimensional Image Reconstruct Radiol Nucl Med
– ident: ref56
  doi: 10.1118/1.595715
– ident: ref59
  doi: 10.1109/TSMC.1973.4309314
– ident: ref61
  doi: 10.1109/TMI.2015.2498148
– ident: ref35
  doi: 10.1109/TIP.2017.2713099
– ident: ref29
  doi: 10.1109/ISBI.2016.7493333
– ident: ref41
  doi: 10.1016/0893-6080(89)90020-8
– year: 2016
  ident: ref39
  publication-title: Densely Connected Convolutional Networks
– ident: ref52
  doi: 10.1109/TIP.2003.819861
– ident: ref13
  doi: 10.1088/0031-9155/53/17/021
– ident: ref47
  doi: 10.1109/ICCV.2015.178
– volume: 10133
  start-page: 1013328
  year: 2017
  ident: ref18
  article-title: View-interpolation of sparsely sampled sinogram using convolutional neural network
  publication-title: Proc SPIE
  doi: 10.1117/12.2254244
– volume: 357
  start-page: 2277
  year: 2007
  ident: ref5
  article-title: Computed tomography-An increasing source of radiation exposure
  publication-title: New England J Med
  doi: 10.1056/NEJMra072149
– ident: ref4
  doi: 10.1109/TMI.2017.2715284
– ident: ref25
  doi: 10.1088/0031-9155/59/12/2997
– ident: ref6
  doi: 10.1148/radiol.2303021726
– ident: ref51
  doi: 10.1016/0893-6080(88)90469-8
– ident: ref10
  doi: 10.1109/TMI.2006.882141
– year: 2016
  ident: ref55
  publication-title: TensorFlow: Large-scale machine learning on heterogeneous distributed systems
– start-page: 1097
  year: 2012
  ident: ref19
  article-title: ImageNet classification with deep convolutional neural networks
  publication-title: Proc NIPS
– ident: ref21
  doi: 10.1109/TIT.2006.871582
– ident: ref15
  doi: 10.1137/0143028
– ident: ref20
  doi: 10.1016/j.ejmp.2012.01.003
StartPage 1407
SubjectTerms Algorithms
Computed tomography
CT reconstruction
Data acquisition
Databases, Factual
Deconvolution
Deep Learning
DenseNet
Humans
Image processing
Image Processing, Computer-Assisted - methods
Image quality
Image reconstruction
Iterative methods
Machine learning
Medical imaging
Neural networks
Preservation
Radiation
Radiation dosage
Radiography, Thoracic
Reconstruction algorithms
Sparse-view CT
State of the art
Tomography, X-Ray Computed - methods
Training
X-ray imaging
Title A Sparse-View CT Reconstruction Method Based on Combination of DenseNet and Deconvolution
URI https://ieeexplore.ieee.org/document/8331861
https://www.ncbi.nlm.nih.gov/pubmed/29870369
https://www.proquest.com/docview/2174512451
https://www.proquest.com/docview/2051065592
Volume 37
WOSCitedRecordID wos000434302700011