Segmentation of parasternal long axis views using deep learning

Bibliographic Details
Published in: IEEE International Ultrasonics Symposium (Online), pp. 1-4
Main Authors: Smistad, Erik; Dalen, Håvard; Grenne, Bjørnar; Løvstakken, Lasse
Format: Conference Proceeding
Language: English
Published: IEEE, 10.10.2022
ISSN: 1948-5727
Description
Summary: Accurate segmentation of the parasternal long axis (PLAX) ultrasound view of the heart is essential for automating the many clinical measurements performed in this view. To annotate all the important structures in the PLAX view efficiently and in a standardized way, a new specialized annotation tool was developed. Using this tool, the left ventricle (LV) lumen, myocardium, left atrium (LA), aorta, right ventricle (RV) and left ventricular outflow tract (LVOT) were annotated in images from 53 subjects and used to train a fully convolutional encoder-decoder neural network. Using cross-validation, the Dice score, mean absolute difference (MAD) and Hausdorff distance were measured for each structure. The results show varying accuracy across structures, with mean Dice between 0.80 and 0.95 and MAD between 0.9 and 1.8 millimeters. The myocardium and LVOT appear to be the most difficult to segment (Dice: 0.80 and 0.84; MAD: 1.3 and 1.2 mm), while the LV, aorta and RV achieve good accuracy (Dice: 0.93-0.95; MAD: 0.9-1.3 mm). Overall, the accuracy for the LV, LA and myocardium appears similar to that achieved in previous studies on images from apical views of the heart. However, more data are needed to increase the robustness of the method and to validate it further.
DOI: 10.1109/IUS54386.2022.9957677
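
Note: The abstract reports per-structure Dice scores for the segmentation masks. As an illustration only (this is not the authors' evaluation code, and the function dice_score is a hypothetical helper), a minimal sketch of how the Dice similarity coefficient between a predicted and a reference binary mask can be computed is shown below:

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentation masks."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    overlap = np.logical_and(pred, target).sum()
    total = pred.sum() + target.sum()
    if total == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * overlap / total

# Illustrative usage with random masks standing in for a predicted and a
# reference segmentation of one structure (e.g. the LV lumen):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prediction = rng.random((256, 256)) > 0.5
    reference = rng.random((256, 256)) > 0.5
    print(f"Dice: {dice_score(prediction, reference):.3f}")
```

In the paper this metric is reported per anatomical structure (LV lumen, myocardium, LA, aorta, RV, LVOT), alongside contour-based measures (MAD and Hausdorff distance), which require extracting mask boundaries and are not shown here.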