Iterative Convolutional Encoder-Decoder Network with Multi-Scale Context Learning for Liver Segmentation

Bibliographic Details
Published in: Applied Artificial Intelligence, Volume 36, Issue 1
Main Authors: Zhang, Feiyan; Yan, Shuhao; Zhao, Yizhong; Gao, Yuan; Li, Zhi; Lu, Xuesong
Format: Journal Article
Language: English
Published: Philadelphia: Taylor & Francis, 31.12.2022
ISSN: 0883-9514, 1087-6545
Online Access: Get full text
Description
Summary: Rapid and accurate extraction of liver tissue from abdominal computed tomography (CT) and magnetic resonance (MR) images is of critical importance for the diagnosis and treatment of hepatic diseases. Because adjacent organs have similar intensities and anatomy varies between subjects, the performance of deep-learning-based segmentation approaches still has room for improvement. In this study, a novel convolutional encoder-decoder network incorporating multi-scale context information is proposed. The probabilistic map from the previous classifier is iteratively fed into the encoder layers, fusing high-level shape context with low-level appearance features in a multi-scale manner. Dense connectivity is adopted to aggregate feature maps of varying scales from the encoder and decoder. We evaluated the proposed method in 2D and 3D applications on abdominal CT and MR images from three public datasets. The proposed method generated liver segmentations with significantly higher accuracy (p < 0.05) than several competing methods. These promising results suggest that the novel model has high potential for clinical workflows.
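The core idea in the abstract, feeding the probability map produced on the previous pass back into the encoder so that shape context refines appearance features, can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example, not the authors' implementation: the names (IterativeEncoderDecoder, iterative_segmentation), channel counts, number of refinement passes, and the simple U-Net-style skip connections standing in for the paper's multi-scale dense connectivity are all assumptions made for illustration.

```python
# Minimal sketch (assumed PyTorch-style architecture, not the authors' code) of an
# encoder-decoder whose input is the image concatenated with the probability map
# predicted on the previous iteration.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with batch norm and ReLU (a common, assumed building block).
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class IterativeEncoderDecoder(nn.Module):
    def __init__(self, in_ch=1, base_ch=32):
        super().__init__()
        # The encoder receives the image plus the previous probability map (+1 channel).
        self.enc1 = conv_block(in_ch + 1, base_ch)
        self.enc2 = conv_block(base_ch, base_ch * 2)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(base_ch * 2, base_ch * 4)
        self.up2 = nn.ConvTranspose2d(base_ch * 4, base_ch * 2, kernel_size=2, stride=2)
        self.dec2 = conv_block(base_ch * 4, base_ch * 2)   # skip connection from enc2
        self.up1 = nn.ConvTranspose2d(base_ch * 2, base_ch, kernel_size=2, stride=2)
        self.dec1 = conv_block(base_ch * 2, base_ch)       # skip connection from enc1
        self.head = nn.Conv2d(base_ch, 1, kernel_size=1)

    def forward(self, image, prob_map):
        # Fuse shape context (previous probability map) with appearance (image).
        x = torch.cat([image, prob_map], dim=1)
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return torch.sigmoid(self.head(d1))

def iterative_segmentation(model, image, num_iters=3):
    # Start from a uniform probability map and refine it over several passes.
    prob = torch.full_like(image, 0.5)
    for _ in range(num_iters):
        prob = model(image, prob)
    return prob

# Usage example with a dummy single-channel 128x128 CT slice.
model = IterativeEncoderDecoder()
ct_slice = torch.randn(1, 1, 128, 128)
mask = iterative_segmentation(model, ct_slice)
```

In this sketch the feedback loop is what makes the network "iterative": later passes see an increasingly confident shape prior alongside the raw intensities, which is the mechanism the abstract credits for separating the liver from adjacent organs of similar intensity.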
DOI: 10.1080/08839514.2022.2151186