Effect of dataset size, image quality, and image type on deep learning-based automatic prostate segmentation in 3D ultrasound

Three-dimensional (3D) transrectal ultrasound (TRUS) is utilized in prostate cancer diagnosis and treatment, necessitating time-consuming manual prostate segmentation. We have previously developed an automatic 3D prostate segmentation algorithm involving deep learning prediction on radially sampled...

Detailed bibliography
Published in: Physics in medicine & biology, Volume 67, Issue 7
Main authors: Orlando, Nathan; Gyacskov, Igor; Gillies, Derek J; Guo, Fumin; Romagnoli, Cesare; D’Souza, David; Cool, Derek W; Hoover, Douglas A; Fenster, Aaron
Format: Journal Article
Language: English
Published: England: IOP Publishing, 07.04.2022
Subjects:
ISSN: 1361-6560
Online access: Get full text
Abstract Three-dimensional (3D) transrectal ultrasound (TRUS) is utilized in prostate cancer diagnosis and treatment, necessitating time-consuming manual prostate segmentation. We have previously developed an automatic 3D prostate segmentation algorithm involving deep learning prediction on radially sampled 2D images followed by 3D reconstruction, trained on a large, clinically diverse dataset with variable image quality. As large clinical datasets are rare, widespread adoption of automatic segmentation could be facilitated with efficient 2D-based approaches and the development of an image quality grading method. The complete training dataset of 6761 2D images, resliced from 206 3D TRUS volumes acquired using end-fire and side-fire acquisition methods, was split to train two separate networks using either end-fire or side-fire images. Split datasets were reduced to 1000, 500, 250, and 100 2D images. For deep learning prediction, modified U-Net and U-Net++ architectures were implemented and compared using an unseen test dataset of 40 3D TRUS volumes. A 3D TRUS image quality grading scale with three factors (acquisition quality, artifact severity, and boundary visibility) was developed to assess the impact on segmentation performance. For the complete training dataset, U-Net and U-Net++ networks demonstrated equivalent performance, but when trained using split end-fire/side-fire datasets, U-Net++ significantly outperformed the U-Net. Compared to the complete training datasets, U-Net++ trained using reduced-size end-fire and side-fire datasets demonstrated equivalent performance down to 500 training images. For this dataset, image quality had no impact on segmentation performance for end-fire images but did have a significant effect for side-fire images, with boundary visibility having the largest impact. Our algorithm provided fast (<1.5 s) and accurate 3D segmentations across clinically diverse images, demonstrating generalizability and efficiency when employed on smaller datasets, supporting the potential for widespread use, even when data is scarce. The development of an image quality grading scale provides a quantitative tool for assessing segmentation performance.
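The abstract describes the 2D-to-3D workflow only at a high level: the 3D TRUS volume is radially resliced into 2D images, a 2D network (U-Net or U-Net++) segments each slice, and a 3D result is reconstructed from the stack of predictions. The Python sketch below illustrates one way the reslicing and slice-wise prediction could be organized. It is a minimal sketch, not the authors' implementation: the plane count, the choice of the central z-axis as the rotation axis, and the function predict_mask_2d (a stand-in for a trained U-Net/U-Net++ forward pass) are all assumptions for illustration.

import numpy as np
from scipy.ndimage import map_coordinates

def radial_reslice(volume, n_planes=30):
    """Reslice a 3D TRUS volume (z, y, x) into n_planes 2D images that all
    contain the central z-axis, rotated at evenly spaced angles over 180 deg.
    The plane count and axis choice are illustrative assumptions."""
    nz, ny, nx = volume.shape
    cy, cx = (ny - 1) / 2.0, (nx - 1) / 2.0
    half_width = min(ny, nx) // 2
    r = np.arange(-half_width, half_width)                 # signed in-plane radius
    zz, rr = np.meshgrid(np.arange(nz), r, indexing="ij")  # (nz, 2*half_width)
    angles = np.linspace(0.0, np.pi, n_planes, endpoint=False)
    planes = []
    for theta in angles:
        # Sample the plane spanned by the z-axis and the in-plane direction theta.
        coords = np.stack([zz, cy + rr * np.sin(theta), cx + rr * np.cos(theta)])
        planes.append(map_coordinates(volume, coords, order=1, mode="nearest"))
    return np.stack(planes), angles                         # (n_planes, nz, 2*half_width)

def segment_volume(volume, predict_mask_2d, n_planes=30):
    """Run a 2D network on every resliced plane. predict_mask_2d is assumed to
    wrap a trained U-Net or U-Net++: (H, W) image -> binary (H, W) mask.
    The published algorithm then reconstructs a 3D surface from the resulting
    radial contours; that reconstruction step is omitted here."""
    planes, angles = radial_reslice(volume, n_planes)
    masks = np.stack([predict_mask_2d(p) for p in planes])
    return masks, angles

# Hypothetical usage with a trained 2D network wrapped as predict_mask_2d:
# masks, angles = segment_volume(trus_volume, predict_mask_2d, n_planes=30)

A slice-wise design of this kind is what makes the approach data-efficient: every 3D volume contributes many 2D training images, which is consistent with the abstract's finding that performance held up down to roughly 500 training slices.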
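The three grading factors (acquisition quality, artifact severity, boundary visibility) are named in the abstract, but the scoring rubric itself is not reproduced in this record. The sketch below shows one way such a grade could be represented in Python; the per-factor score range, the summed total, and the binning threshold are illustrative assumptions, not the published scale.

from dataclasses import dataclass

@dataclass
class TRUSQualityGrade:
    """Three-factor 3D TRUS image quality grade. Factor names come from the
    abstract; score ranges and aggregation are assumptions for illustration."""
    acquisition_quality: int   # e.g. probe coupling, gain, overall signal
    artifact_severity: int     # e.g. shadowing, reverberation (higher = fewer artifacts)
    boundary_visibility: int   # conspicuity of the prostate boundary

    def total(self) -> int:
        # Simple sum as an illustrative aggregate score.
        return self.acquisition_quality + self.artifact_severity + self.boundary_visibility

# Hypothetical usage: grade a side-fire volume, then bin it for the kind of
# segmentation-performance comparison described in the abstract.
grade = TRUSQualityGrade(acquisition_quality=4, artifact_severity=3, boundary_visibility=2)
quality_bin = "high" if grade.total() >= 10 else "low"
print(quality_bin, grade.total())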
Author Fenster, Aaron
D’Souza, David
Gillies, Derek J
Cool, Derek W
Guo, Fumin
Orlando, Nathan
Hoover, Douglas A
Romagnoli, Cesare
Gyacskov, Igor
Author_xml – sequence: 1
  givenname: Nathan
  surname: Orlando
  fullname: Orlando, Nathan
  organization: Western University Robarts Research Institute, London, Ontario N6A 3K7, Canada
– sequence: 2
  givenname: Igor
  surname: Gyacskov
  fullname: Gyacskov, Igor
  organization: Western University Robarts Research Institute, London, Ontario N6A 3K7, Canada
– sequence: 3
  givenname: Derek J
  surname: Gillies
  fullname: Gillies, Derek J
  organization: London Health Sciences Centre, London, Ontario N6A 5W9, Canada
– sequence: 4
  givenname: Fumin
  surname: Guo
  fullname: Guo, Fumin
  organization: University of Toronto Department of Medical Biophysics, Toronto, Ontario M4N 3M5, Canada
– sequence: 5
  givenname: Cesare
  surname: Romagnoli
  fullname: Romagnoli, Cesare
  organization: Western University Department of Medical Imaging, London, Ontario N6A 3K7, Canada
– sequence: 6
  givenname: David
  surname: D’Souza
  fullname: D’Souza, David
  organization: Western University Department of Oncology, London, Ontario N6A 3K7, Canada
– sequence: 7
  givenname: Derek W
  surname: Cool
  fullname: Cool, Derek W
  organization: Western University Department of Medical Imaging, London, Ontario N6A 3K7, Canada
– sequence: 8
  givenname: Douglas A
  surname: Hoover
  fullname: Hoover, Douglas A
  organization: Western University Department of Oncology, London, Ontario N6A 3K7, Canada
– sequence: 9
  givenname: Aaron
  surname: Fenster
  fullname: Fenster, Aaron
  organization: Western University Department of Oncology, London, Ontario N6A 3K7, Canada
BackLink https://www.ncbi.nlm.nih.gov/pubmed/35240585 (View this record in MEDLINE/PubMed)
CODEN PHMBA7
CitedBy_id crossref_primary_10_5534_wjmh_230050
crossref_primary_10_1016_j_ultrasmedbio_2022_12_005
crossref_primary_10_1016_j_brachy_2022_11_011
crossref_primary_10_1186_s41984_023_00213_0
crossref_primary_10_1088_1361_6560_aca069
crossref_primary_10_1002_ail2_101
crossref_primary_10_1007_s10278_023_00783_3
crossref_primary_10_1038_s41598_025_00309_7
crossref_primary_10_1016_j_zemedi_2022_10_005
crossref_primary_10_1007_s11548_025_03400_6
crossref_primary_10_1038_s41598_025_00966_8
crossref_primary_10_3390_jimaging8090252
crossref_primary_10_1016_j_patcog_2023_109925
crossref_primary_10_1088_1361_6560_acf5c5
crossref_primary_10_1016_j_eswa_2024_124279
crossref_primary_10_3390_diagnostics12123197
crossref_primary_10_1016_j_brachy_2023_04_003
crossref_primary_10_1109_TUFFC_2023_3255843
crossref_primary_10_1016_j_ultrasmedbio_2024_10_005
crossref_primary_10_1016_j_ejrad_2023_110928
crossref_primary_10_3390_cancers16193424
crossref_primary_10_1007_s13721_023_00412_7
crossref_primary_10_1016_j_semradonc_2022_06_008
crossref_primary_10_1007_s10489_023_04676_4
crossref_primary_10_3389_fphy_2024_1398393
crossref_primary_10_1016_j_bone_2023_116987
ContentType Journal Article
Copyright 2022 The Author(s). Published on behalf of Institute of Physics and Engineering in Medicine by IOP Publishing Ltd
Creative Commons Attribution license.
Copyright_xml – notice: 2022 The Author(s). Published on behalf of Institute of Physics and Engineering in Medicine by IOP Publishing Ltd
– notice: Creative Commons Attribution license.
DBID O3W
TSCCA
CGR
CUY
CVF
ECM
EIF
NPM
7X8
DOI 10.1088/1361-6560/ac5a93
DatabaseName Institute of Physics Open Access Journal Titles
IOPscience (Open Access)
Medline
MEDLINE
MEDLINE (Ovid)
MEDLINE
MEDLINE
PubMed
MEDLINE - Academic
DatabaseTitle MEDLINE
Medline Complete
MEDLINE with Full Text
PubMed
MEDLINE (Ovid)
MEDLINE - Academic
DatabaseTitleList MEDLINE - Academic
MEDLINE
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: O3W
  name: Institute of Physics Open Access Journal Titles
  url: http://iopscience.iop.org/
  sourceTypes:
    Enrichment Source
    Publisher
– sequence: 3
  dbid: 7X8
  name: MEDLINE - Academic
  url: https://search.proquest.com/medline
  sourceTypes: Aggregation Database
DeliveryMethod fulltext_linktorsrc
Discipline Medicine
Biology
Physics
EISSN 1361-6560
ExternalDocumentID 35240585
pmbac5a93
Genre Research Support, Non-U.S. Gov't
Journal Article
GrantInformation_xml – fundername: Ontario Institute for Cancer Research
  grantid: P.IT.033
  funderid: https://doi.org/10.13039/501100004203
– fundername: Canadian Institutes of Health Research
  grantid: 374556
  funderid: https://doi.org/10.13039/501100000024
– fundername: London Regional Cancer Program Catalyst Grant
– fundername: Natural Sciences and Engineering Research Council of Canada
  funderid: https://doi.org/10.13039/501100000038
– fundername: CIHR
ISICitedReferencesCount 28
ISSN 1361-6560
IngestDate Fri Sep 05 07:58:07 EDT 2025
Wed Apr 16 06:21:20 EDT 2025
Wed Jun 07 11:18:59 EDT 2023
Wed Aug 21 03:34:57 EDT 2024
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 7
Keywords deep learning
image quality
prostate cancer
small dataset
3D ultrasound prostate segmentation
biopsy
brachytherapy
Language English
License Original content from this work may be used under the terms of the Creative Commons Attribution 4.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
Creative Commons Attribution license.
LinkModel DirectLink
Notes PMB-112471.R1
ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 23
OpenAccessLink https://iopscience.iop.org/article/10.1088/1361-6560/ac5a93
PMID 35240585
PQID 2636147734
PQPubID 23479
PageCount 20
ParticipantIDs proquest_miscellaneous_2636147734
iop_journals_10_1088_1361_6560_ac5a93
pubmed_primary_35240585
PublicationCentury 2000
PublicationDate 2022-04-07
PublicationDateYYYYMMDD 2022-04-07
PublicationDate_xml – month: 04
  year: 2022
  text: 2022-04-07
  day: 07
PublicationDecade 2020
PublicationPlace England
PublicationPlace_xml – name: England
PublicationTitle Physics in medicine & biology
PublicationTitleAbbrev PMB
PublicationTitleAlternate Phys. Med. Biol
PublicationYear 2022
Publisher IOP Publishing
Publisher_xml – name: IOP Publishing
SSID ssj0011824
Score 2.5370753
Snippet Three-dimensional (3D) transrectal ultrasound (TRUS) is utilized in prostate cancer diagnosis and treatment, necessitating time-consuming manual prostate...
SourceID proquest
pubmed
iop
SourceType Aggregation Database
Index Database
Enrichment Source
Publisher
SubjectTerms 3D ultrasound prostate segmentation
biopsy
brachytherapy
Deep Learning
Humans
image quality
Male
Pelvis
Prostate - diagnostic imaging
prostate cancer
Prostatic Neoplasms - diagnostic imaging
small dataset
Ultrasonography
Title Effect of dataset size, image quality, and image type on deep learning-based automatic prostate segmentation in 3D ultrasound
URI https://iopscience.iop.org/article/10.1088/1361-6560/ac5a93
https://www.ncbi.nlm.nih.gov/pubmed/35240585
https://www.proquest.com/docview/2636147734
Volume 67
WOSCitedRecordID wos000774581100001
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
linkProvider IOP Publishing
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Effect+of+dataset+size%2C+image+quality%2C+and+image+type+on+deep+learning-based+automatic+prostate+segmentation+in+3D+ultrasound&rft.jtitle=Physics+in+medicine+%26+biology&rft.au=Orlando%2C+Nathan&rft.au=Gyacskov%2C+Igor&rft.au=Gillies%2C+Derek+J&rft.au=Guo%2C+Fumin&rft.date=2022-04-07&rft.pub=IOP+Publishing&rft.eissn=1361-6560&rft.volume=67&rft.issue=7&rft_id=info:doi/10.1088%2F1361-6560%2Fac5a93&rft.externalDocID=pmbac5a93