Automatic Semantic Segmentation of Brain Gliomas from MRI Images Using a Deep Cascaded Neural Network

Bibliographic Details
Published in: Journal of Healthcare Engineering, Vol. 2018, no. 2018, pp. 1-14
Main Authors: Liu, Chang, Jiang, Jingfeng, Mao, Lei, Cui, Shaoguo, Xiong, Shuyu
Format: Journal Article
Language:English
Published: Cairo, Egypt: Hindawi Publishing Corporation, 01.01.2018
Hindawi
John Wiley & Sons, Inc
Subjects:
ISSN:2040-2295, 2040-2309
Online Access:Get full text
Abstract Brain tumors can appear anywhere in the brain and vary widely in size and morphology. Additionally, these tumors are often diffuse and poorly contrasted. Consequently, segmenting a brain tumor and its intratumor subregions from magnetic resonance imaging (MRI) data with minimal human intervention remains a challenging task. In this paper, we present a novel fully automatic segmentation method for MRI data containing in vivo brain gliomas. This approach not only localizes the entire tumor region but also accurately segments the intratumor structure. The proposed work is based on a cascaded deep convolutional neural network consisting of two subnetworks: (1) a tumor localization network (TLN) and (2) an intratumor classification network (ITCN). The TLN, a fully convolutional network (FCN) combined with transfer learning, first processes the MRI data; the goal of this first subnetwork is to delineate the tumor region in an MRI slice. The ITCN then labels the delineated tumor region into multiple subregions. In particular, the ITCN uses a convolutional neural network (CNN) with a deeper architecture and smaller kernels. The proposed approach was validated on the multimodal brain tumor segmentation (BRATS 2015) datasets, which contain 220 high-grade glioma (HGG) and 54 low-grade glioma (LGG) cases. Dice similarity coefficient (DSC), positive predictive value (PPV), and sensitivity were used as evaluation metrics. Our experimental results indicate that the method obtains promising segmentation results at a faster segmentation speed. More specifically, the proposed method obtained comparable and overall better DSC values (0.89, 0.77, and 0.80) on the combined (HGG + LGG) testing set than other methods reported in the literature. Additionally, the proposed approach completed a segmentation task at a rate of 1.54 seconds per slice.
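The two-stage cascade described in the abstract (the TLN proposes a whole-tumor mask; the ITCN then classifies subregions only inside that mask) can be sketched as a control-flow skeleton. This is an illustrative sketch, not the authors' implementation: `tln`, `itcn`, and the default patch size are hypothetical stand-ins for the trained networks and their configuration.

```python
import numpy as np

def cascade_segment(mri_slice, tln, itcn, patch=33):
    """Two-stage cascade sketch: localize the tumor, then label subregions.

    mri_slice : 2D float array (one MRI slice)
    tln       : callable returning a binary whole-tumor mask (stand-in)
    itcn      : callable mapping an image patch to a subregion label (stand-in)
    patch     : side length of the square patch fed to the ITCN (illustrative)
    """
    mask = tln(mri_slice)                       # stage 1: whole-tumor mask
    labels = np.zeros(mri_slice.shape, dtype=int)
    half = patch // 2
    padded = np.pad(mri_slice, half, mode="edge")  # so border pixels get full patches
    for r, c in zip(*np.nonzero(mask)):         # stage 2: classify only inside the mask
        patch_img = padded[r:r + patch, c:c + patch]
        labels[r, c] = itcn(patch_img)          # per-pixel subregion label
    return labels
```

Restricting stage 2 to the TLN mask is what makes the cascade fast: the ITCN never evaluates patches over healthy tissue, which dominates a typical slice.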
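The three evaluation metrics named in the abstract have standard definitions over binary masks: DSC = 2·TP / (|pred| + |truth|), PPV = TP / (TP + FP), and sensitivity = TP / (TP + FN). A minimal sketch of how they could be computed (an illustrative helper, not the authors' code):

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Compute DSC, PPV, and sensitivity for two binary segmentation masks.

    pred, truth: arrays of the same shape; True (or 1) marks tumor voxels.
    Assumes at least one positive voxel in each mask (no zero-division guard).
    """
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    tp = np.logical_and(pred, truth).sum()       # true positives
    fp = np.logical_and(pred, ~truth).sum()      # false positives
    fn = np.logical_and(~pred, truth).sum()      # false negatives
    dsc = 2.0 * tp / (pred.sum() + truth.sum())  # Dice similarity coefficient
    ppv = tp / (tp + fp)                         # positive predictive value
    sensitivity = tp / (tp + fn)                 # a.k.a. recall
    return dsc, ppv, sensitivity
```

In BRATS-style evaluation these metrics are reported separately for each nested tumor region (e.g. whole tumor, tumor core, enhancing tumor), each reduced to its own binary mask.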
Audience Academic
Author Jiang, Jingfeng
Mao, Lei
Cui, Shaoguo
Xiong, Shuyu
Liu, Chang
AuthorAffiliation 1 College of Computer Science and Engineering, Chongqing University of Technology, Chongqing 400054, China
2 Medical Physics Department, University of Wisconsin, Madison, WI 53705, USA
3 Biomedical Engineering Department, Michigan Technological University, Houghton, MI 49931, USA
BackLink https://www.ncbi.nlm.nih.gov/pubmed/29755716 (View this record in MEDLINE/PubMed)
ContentType Journal Article
Copyright Copyright © 2018 Shaoguo Cui et al.
COPYRIGHT 2018 John Wiley & Sons, Inc.
Copyright © 2018 Shaoguo Cui et al. 2018
DOI 10.1155/2018/4940593
DatabaseName e-Marefa Academic and Statistical Periodicals
e-Marefa Academic Complete
Hindawi Publishing Complete
Hindawi Publishing Subscription Journals
Hindawi Publishing Open Access
CrossRef
PubMed
MEDLINE - Academic
PubMed Central (Full Participant titles)
DatabaseTitle CrossRef
PubMed
MEDLINE - Academic
DeliveryMethod fulltext_linktorsrc
Discipline Engineering
EISSN 2040-2309
Editor Chang, Weide
EndPage 14
ExternalDocumentID PMC5884212
A587654896
29755716
10_1155_2018_4940593
1187329
Genre Research Support, Non-U.S. Gov't
Journal Article
GrantInformation_xml – fundername: Chongqing University of Technology
  grantid: YCX2016230
– fundername: Chongqing Municipal Education Commission
  grantid: KJ1709210; 16SKGH133
– fundername: Chongqing Science and Technology Commission
  grantid: cstc2016jcyjA0383
ISICitedReferencesCount 108
ISSN 2040-2295
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 2018
Language English
License This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
http://creativecommons.org/licenses/by/4.0
LinkModel DirectLink
Notes Academic Editor: Weide Chang
ORCID 0000-0001-8812-6246
0000-0003-3064-0490
OpenAccessLink https://dx.doi.org/10.1155/2018/4940593
PMID 29755716
PQID 2038705269
PQPubID 23479
PageCount 14
PublicationDate 2018-01-01
PublicationPlace Cairo, Egypt
PublicationTitle Journal of healthcare engineering
PublicationTitleAlternate J Healthc Eng
PublicationYear 2018
Publisher Hindawi Publishing Corporation
Hindawi
John Wiley & Sons, Inc
SourceID pubmedcentral
proquest
gale
pubmed
crossref
hindawi
emarefa
SourceType Open Access Repository
Aggregation Database
Index Database
Enrichment Source
Publisher
StartPage 1
SubjectTerms Brain tumors
Gliomas
Magnetic resonance imaging
Medical imaging equipment
Neural networks
Title Automatic Semantic Segmentation of Brain Gliomas from MRI Images Using a Deep Cascaded Neural Network
URI https://search.emarefa.net/detail/BIM-1187329
https://dx.doi.org/10.1155/2018/4940593
https://www.ncbi.nlm.nih.gov/pubmed/29755716
https://www.proquest.com/docview/2038705269
https://pubmed.ncbi.nlm.nih.gov/PMC5884212
Volume 2018
linkProvider Hindawi Publishing