Machine learning algorithm using publicly available echo database for simplified "visual estimation" of left ventricular ejection fraction

Bibliographic Details
Published in: World journal of experimental medicine; Vol. 12; No. 2; p. 16
Main Authors: Blaivas, Michael; Blaivas, Laura
Format: Journal Article
Language: English
Published: United States, 20.03.2022
Subjects: Deep learning; Echocardiography; Artificial intelligence; Cardiac; Point-of-care-ultrasound; Ejection fraction
ISSN: 2220-315X
Abstract BACKGROUND: Left ventricular ejection fraction calculation automation typically requires complex algorithms and depends on optimal visualization and tracing of endocardial borders. This significantly limits usability in bedside clinical applications, where ultrasound automation is needed most. AIM: To create a simple deep learning (DL) regression-type algorithm to visually estimate left ventricular (LV) ejection fraction (EF) from a public database of actual patient echo examinations and compare results to echocardiography laboratory EF calculations. METHODS: A simple DL architecture previously proven to perform well on ultrasound image analysis, VGG16, was utilized as the base architecture within a long short-term memory (LSTM) network for sequential image (video) analysis. After obtaining permission to use the Stanford EchoNet-Dynamic database, researchers randomly removed approximately 15% of the roughly 10036 apical 4-chamber echo videos for later performance testing. All database echo examinations had been read as part of comprehensive echocardiography studies and were coupled with EF, end-systolic and end-diastolic volumes, key frames, and coordinates for LV endocardial tracing in a CSV file. To better reflect point-of-care ultrasound (POCUS) clinical settings and time pressure, the algorithm was trained on echo video paired with the calculated ejection fraction, without incorporating the additional volume, measurement, and coordinate data. Seventy percent of the original data was used for algorithm training and 15% for validation during training. The previously separated 15% (1263 echo videos) was used for algorithm performance testing after training completion. Given the inherent variability of echo EF measurement and field standards for evaluating algorithm accuracy, mean absolute error (MAE) and root mean square error (RMSE) were calculated for algorithm EF results against the echo laboratory calculated EF, and a Bland-Altman analysis was also performed. MAE for skilled echocardiographers has been established to range from 4% to 5%. RESULTS: The DL algorithm's visually estimated EF had a MAE of 8.08% (95%CI: 7.60 to 8.55), suggesting good performance compared to highly skilled humans. The RMSE was 11.98 and the correlation was 0.348. CONCLUSION: This experimental simplified DL algorithm showed promise and proved reasonably accurate at visually estimating LV EF from short real-time echo video clips. Less burdensome than the complex DL approaches used for EF calculation, such an approach may be more optimal for POCUS settings once improved upon by future research and development.
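
The methods describe frame-level VGG16 features combined with a long short-term memory network to regress EF directly from apical 4-chamber video clips. The following is a minimal sketch of that kind of architecture in Keras/TensorFlow; the frame count, 112x112 input size, layer widths, optimizer, and the choice to freeze the convolutional base are illustrative assumptions, not details taken from the paper.

# Hypothetical VGG16 + LSTM video-regression sketch; hyperparameters are
# illustrative assumptions, not values reported by the authors.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

FRAMES, HEIGHT, WIDTH, CHANNELS = 32, 112, 112, 3  # assumed clip shape

# Frame-level feature extractor: VGG16 convolutional base applied to each frame.
vgg_base = VGG16(include_top=False, weights="imagenet",
                 input_shape=(HEIGHT, WIDTH, CHANNELS), pooling="avg")
vgg_base.trainable = False  # illustrative choice; the paper does not specify

model = models.Sequential([
    layers.Input(shape=(FRAMES, HEIGHT, WIDTH, CHANNELS)),
    # TimeDistributed runs the same VGG16 base over every frame of the clip.
    layers.TimeDistributed(vgg_base),
    # The LSTM aggregates per-frame features across the cardiac cycle.
    layers.LSTM(128),
    layers.Dense(64, activation="relu"),
    # Single linear output: the visually estimated ejection fraction (%).
    layers.Dense(1, activation="linear"),
])

# Regression against echo-lab EF labels; MAE matches the reported error metric.
model.compile(optimizer="adam", loss="mse",
              metrics=[tf.keras.metrics.MeanAbsoluteError()])

Training would then pair each video tensor with its echo laboratory EF value, using the 70/15/15 train/validation/test split described above.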
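The reported agreement statistics (MAE, RMSE, Pearson correlation, and Bland-Altman limits of agreement) can be computed from paired EF values; the sketch below is a generic NumPy illustration with assumed variable names, not the authors' code.

# Hypothetical agreement metrics for predicted vs. echo-lab EF (NumPy arrays, in %).
import numpy as np

def ef_agreement(predicted_ef: np.ndarray, echolab_ef: np.ndarray) -> dict:
    errors = predicted_ef - echolab_ef
    mae = np.mean(np.abs(errors))                     # mean absolute error
    rmse = np.sqrt(np.mean(errors ** 2))              # root mean square error
    r = np.corrcoef(predicted_ef, echolab_ef)[0, 1]   # Pearson correlation
    bias = np.mean(errors)                            # Bland-Altman bias
    loa = 1.96 * np.std(errors, ddof=1)               # 95% limits of agreement
    return {"MAE": mae, "RMSE": rmse, "r": r,
            "bias": bias, "LoA": (bias - loa, bias + loa)}

Applied to the 1263 held-out test videos, these quantities correspond to the reported MAE of 8.08%, RMSE of 11.98, and correlation of 0.348.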
Author Blaivas, Michael
Blaivas, Laura
Author_xml – sequence: 1
  givenname: Michael
  surname: Blaivas
  fullname: Blaivas, Michael
  email: mike@blaivas.org
  organization: Department of Medicine, University of South Carolina School of Medicine, Roswell, GA 30076, United States. mike@blaivas.org
– sequence: 2
  givenname: Laura
  surname: Blaivas
  fullname: Blaivas, Laura
  organization: Department of Environmental Science, Michigan State University, Roswell, Georgia 30076, United States
BackLink https://www.ncbi.nlm.nih.gov/pubmed/35433318 (View this record in MEDLINE/PubMed)
CitedBy_id crossref_primary_10_1016_j_ultrasmedbio_2025_03_015
crossref_primary_10_1109_TIM_2025_3553885
crossref_primary_10_1016_j_ccc_2025_02_008
crossref_primary_10_3390_diagnostics13132155
crossref_primary_10_7717_peerj_cs_2506
crossref_primary_10_3390_biomedicines12102324
crossref_primary_10_1016_j_ejmp_2024_104505
crossref_primary_10_1007_s10278_024_01336_y
ContentType Journal Article
Copyright The Author(s) 2022. Published by Baishideng Publishing Group Inc. All rights reserved.
Copyright_xml – notice: The Author(s) 2022. Published by Baishideng Publishing Group Inc. All rights reserved.
DBID NPM
7X8
DOI 10.5493/wjem.v12.i2.16
DatabaseName PubMed
MEDLINE - Academic
DatabaseTitle PubMed
MEDLINE - Academic
DatabaseTitleList PubMed
MEDLINE - Academic
Database_xml – sequence: 1
  dbid: NPM
  name: PubMed
  url: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed
  sourceTypes: Index Database
– sequence: 2
  dbid: 7X8
  name: MEDLINE - Academic
  url: https://search.proquest.com/medline
  sourceTypes: Aggregation Database
DeliveryMethod no_fulltext_linktorsrc
Discipline Medicine
EISSN 2220-315X
ExternalDocumentID 35433318
Genre Journal Article
GroupedDBID 5VR
8WL
ADBBV
ALMA_UNASSIGNED_HOLDINGS
AOIJS
CCEZO
CHBEP
CIEJG
GX1
HYE
NPM
OK1
RPM
7X8
ID FETCH-LOGICAL-c3056-a26e535c52e64c8cf7e14e42279a49f84e53c3ad25b54842daf1d448aa24bfd32
IEDL.DBID 7X8
ISSN 2220-315X
IngestDate Fri Jul 11 07:17:41 EDT 2025
Mon Jul 21 05:15:12 EDT 2025
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed false
IsScholarly true
Issue 2
Keywords Deep learning
Echocardiography
Artificial intelligence
Cardiac
Point-of-care-ultrasound
Ejection fraction
Language English
License The Author(s) 2022. Published by Baishideng Publishing Group Inc. All rights reserved.
LinkModel DirectLink
MergedId FETCHMERGED-LOGICAL-c3056-a26e535c52e64c8cf7e14e42279a49f84e53c3ad25b54842daf1d448aa24bfd32
Notes ObjectType-Article-1
SourceType-Scholarly Journals-1
ObjectType-Feature-2
content type line 23
OpenAccessLink https://doi.org/10.5493/wjem.v12.i2.16
PMID 35433318
PQID 2652032283
PQPubID 23479
ParticipantIDs proquest_miscellaneous_2652032283
pubmed_primary_35433318
PublicationCentury 2000
PublicationDate 2022-Mar-20
20220320
PublicationDateYYYYMMDD 2022-03-20
PublicationDate_xml – month: 03
  year: 2022
  text: 2022-Mar-20
  day: 20
PublicationDecade 2020
PublicationPlace United States
PublicationPlace_xml – name: United States
PublicationTitle World journal of experimental medicine
PublicationTitleAlternate World J Exp Med
PublicationYear 2022
SSID ssj0001129528
Score 2.218489
Snippet Left ventricular ejection fraction calculation automation typically requires complex algorithms and depends on optimal visualization and tracing of...
SourceID proquest
pubmed
SourceType Aggregation Database
Index Database
StartPage 16
Title Machine learning algorithm using publicly available echo database for simplified "visual estimation" of left ventricular ejection fraction
URI https://www.ncbi.nlm.nih.gov/pubmed/35433318
https://www.proquest.com/docview/2652032283
Volume 12
hasFullText
inHoldings 1
isFullTextHit
isPrint
linkProvider ProQuest
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Machine+learning+algorithm+using+publicly+available+echo+database+for+simplified+%22visual+estimation%22+of+left+ventricular+ejection+fraction&rft.jtitle=World+journal+of+experimental+medicine&rft.au=Blaivas%2C+Michael&rft.au=Blaivas%2C+Laura&rft.date=2022-03-20&rft.issn=2220-315X&rft.eissn=2220-315X&rft.volume=12&rft.issue=2&rft.spage=16&rft_id=info:doi/10.5493%2Fwjem.v12.i2.16&rft_id=info%3Apmid%2F35433318&rft_id=info%3Apmid%2F35433318&rft.externalDocID=35433318