Segmentation of parasternal long axis views using deep learning

Published in: IEEE International Ultrasonics Symposium (Online), pp. 1-4
Main authors: Smistad, Erik; Dalen, Havard; Grenne, Bjornar; Lovstakken, Lasse
Format: Conference paper
Language: English
Published: IEEE, 10.10.2022
ISSN: 1948-5727
Online access: Get full text
Description
Summary: Accurate segmentation of the parasternal long axis (PLAX) ultrasound view of the heart is essential for automating the many clinical measurements performed in this view. To annotate all the important structures in the PLAX view efficiently and in a standardized way, a new specialized annotation tool was developed. Using this tool, the left ventricle (LV) lumen, myocardium, left atrium (LA), aorta, right ventricle (RV) and left ventricular outflow tract (LVOT) were annotated in images from 53 subjects and used to train a fully convolutional encoder-decoder neural network. Using cross-validation, the Dice score, mean absolute difference (MAD), and Hausdorff distance were measured for each structure. The results show varying accuracy for the different structures, with mean Dice between 0.80 and 0.95 and MAD between 0.9 and 1.8 millimeters. The myocardium and LVOT appear to be the most difficult to segment (Dice: 0.80 and 0.84; MAD: 1.3 and 1.2 mm), while the LV, aorta and RV achieve quite good accuracy (Dice: 0.93-0.95; MAD: 0.9-1.3 mm). Overall, the accuracy for the LV, LA and myocardium seems similar to that achieved in previous studies on images from apical views of the heart. However, more data are needed to increase the robustness of the method and to validate it further.
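
The abstract names a fully convolutional encoder-decoder network but does not reproduce its architecture here. As a minimal sketch only, assuming PyTorch and illustrative layer sizes (the class count of 7, covering background plus the six annotated structures, and the depth and channel counts are assumptions, not the paper's design):

    import torch
    import torch.nn as nn

    class EncoderDecoderSketch(nn.Module):
        """Tiny fully convolutional encoder-decoder producing per-pixel
        class logits. Purely illustrative; not the paper's architecture."""
        def __init__(self, n_classes=7):  # background + 6 structures (assumption)
            super().__init__()
            def block(cin, cout):
                return nn.Sequential(
                    nn.Conv2d(cin, cout, 3, padding=1), nn.ReLU(inplace=True),
                    nn.Conv2d(cout, cout, 3, padding=1), nn.ReLU(inplace=True))
            self.enc1 = block(1, 32)                 # encoder stage 1
            self.pool = nn.MaxPool2d(2)              # downsample by 2
            self.enc2 = block(32, 64)                # encoder stage 2
            self.up = nn.Upsample(scale_factor=2, mode="bilinear",
                                  align_corners=False)
            self.dec1 = block(64, 32)                # decoder stage
            self.head = nn.Conv2d(32, n_classes, 1)  # 1x1 conv to logits

        def forward(self, x):                        # x: (B, 1, H, W), H and W even
            x = self.enc2(self.pool(self.enc1(x)))
            return self.head(self.dec1(self.up(x)))  # (B, n_classes, H, W)

Such a network would typically be trained with a per-pixel loss, e.g. torch.nn.CrossEntropyLoss, on the annotated label maps.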
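
Likewise, the three reported metrics can be computed per structure from binary masks. A minimal NumPy/SciPy sketch, assuming boolean mask arrays and a pixel-spacing parameter px_mm (an assumed name) for converting pixel distances to millimeters; the MAD here is one common symmetric mean contour-distance definition, which may differ in detail from the paper's:

    import numpy as np
    from scipy.ndimage import binary_erosion, distance_transform_edt
    from scipy.spatial.distance import directed_hausdorff

    def boundary(mask):
        # Boundary pixels: the mask minus its one-pixel erosion.
        return mask & ~binary_erosion(mask)

    def dice(pred, gt):
        # Dice overlap between two boolean masks.
        denom = pred.sum() + gt.sum()
        return 2.0 * np.logical_and(pred, gt).sum() / denom if denom else 1.0

    def mad_mm(pred, gt, px_mm=1.0):
        # Symmetric mean absolute contour distance, scaled to millimeters.
        dt_gt = distance_transform_edt(~boundary(gt))    # distance to GT contour
        dt_pred = distance_transform_edt(~boundary(pred))
        dists = np.concatenate([dt_gt[boundary(pred)], dt_pred[boundary(gt)]])
        return px_mm * dists.mean()

    def hausdorff_mm(pred, gt, px_mm=1.0):
        # Symmetric Hausdorff distance between the two contours.
        p = np.argwhere(boundary(pred)).astype(float)
        g = np.argwhere(boundary(gt)).astype(float)
        return px_mm * max(directed_hausdorff(p, g)[0],
                           directed_hausdorff(g, p)[0])

For a multi-class label map, each structure k would be evaluated as dice(pred_labels == k, gt_labels == k), and likewise for the two distance metrics.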
DOI: 10.1109/IUS54386.2022.9957677