Colorectal Cancer Tissue Classification Using Semi-Supervised Hypergraph Convolutional Network

Detailed bibliography
Published in: Proceedings (International Symposium on Biomedical Imaging), pp. 1306-1309
Main authors: Bakht, Ahsan Baidar; Javed, Sajid; AlMarzouqi, Hasan; Khandoker, Ahsan; Werghi, Naoufel
Format: Conference paper
Language: English
Published: IEEE, 13.04.2021
ISSN:1945-8452
Description
Summary: Colorectal Cancer (CRC) is a leading cause of death worldwide, and therefore the analysis of the tumor microenvironment in CRC whole-slide images (WSIs) is important for the early detection of CRC. Conventional visual inspection is very time consuming, and the process is prone to inaccuracies because of subjective, observer-level assessment. Deep learning has shown promising results in medical image analysis; however, these approaches require a large number of images labeled by medical experts. In this paper, we propose a semi-supervised algorithm for CRC tissue classification. We employ a hypergraph neural network to classify seven biologically meaningful CRC tissue types. First, deep features are extracted from input patches using a pre-trained VGG19 model. A hypergraph is then constructed in which the patch-level deep features represent the vertices, and hyperedges are assigned using pairwise Euclidean distance. The hyperedges, vertices, and their corresponding patch-level features are passed through a feed-forward neural network to perform tissue classification in a transductive manner. Experiments are performed on an independent CRC tissue classification dataset and compared with existing state-of-the-art methods. Our results show that the proposed algorithm outperforms existing methods, achieving an overall accuracy of 95.46% and an AvTP of 94.42%.
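The hypergraph-construction step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: it assumes hyperedges are formed by grouping each vertex with its k nearest neighbours under pairwise Euclidean distance (the abstract does not state the neighbourhood size, so `k` here is an illustrative parameter), and it uses small random 4-dimensional vectors in place of the 4096-dimensional VGG19 patch descriptors.

```python
import numpy as np

def build_hypergraph_incidence(features, k=3):
    """Build a hypergraph incidence matrix H (n_vertices x n_hyperedges).

    Each patch-level feature vector is a vertex. For every vertex we form
    one hyperedge containing that vertex and its k nearest neighbours
    under pairwise Euclidean distance, so H has one column per vertex.
    H[v, e] = 1 means vertex v belongs to hyperedge e.
    """
    n = features.shape[0]
    # Pairwise Euclidean distances between all feature vectors.
    diff = features[:, None, :] - features[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    H = np.zeros((n, n))
    for i in range(n):
        # k nearest neighbours of vertex i (itself included, at distance 0).
        nearest = np.argsort(dist[i])[: k + 1]
        H[nearest, i] = 1.0
    return H

# Toy example: 5 "patches" with 4-dimensional stand-in features.
rng = np.random.default_rng(0)
feats = rng.normal(size=(5, 4))
H = build_hypergraph_incidence(feats, k=2)
print(H.shape)  # (5, 5): one hyperedge per vertex
```

In a hypergraph convolutional network, this incidence matrix (together with vertex and hyperedge degree normalizations) defines the propagation operator applied to the patch features before the feed-forward classification layers.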
DOI:10.1109/ISBI48211.2021.9434036