ROI-Based Multimodal Neuroimaging Feature Fusion Method and Its Graph Neural Network Diagnostic Model

Detailed bibliography
Published in: IEEE Access, Vol. 13, pp. 26915-26926
Main authors: Wang, Xuan; Yang, Xiaopeng; Zhang, Xiaotong; Chen, Yang
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025
ISSN: 2169-3536
Description
Summary: Single-modality neuroimaging data often provide limited information and are constrained by technical issues such as signal-to-noise ratio and resolution limitations, potentially leading to biases and an incomplete understanding of brain complexity. This can hinder the development of diagnostic and therapeutic strategies for brain disorders. To address these challenges, this paper presents the Multimodal Graph Neural Network Model based on Feature Fusion (MMP-DGNN), which leverages sMRI and PET data. The model uses an autoencoder to extract and accurately describe sample features. During feature fusion, a shared adjacency matrix based on feature similarity and phenotypic data is constructed for graph representation. A dual-layer graph neural network then classifies the features, and the results are fused at the decision layer for the final classification. Experimental results show that MMP-DGNN achieves a superior classification accuracy of 98.17%, outperforming other methods in multimodal neuroimaging data classification.
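
The abstract describes the pipeline only at a high level. The PyTorch sketch below shows one plausible wiring of the described components: per-modality autoencoder feature extraction, a shared adjacency matrix built from feature similarity and phenotypic data, a two-layer graph network per modality, and decision-level fusion. All names (Autoencoder, shared_adjacency, TwoLayerGCN), layer sizes, the similarity threshold, and the placeholder phenotypic-similarity matrix are illustrative assumptions, not the authors' published implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class Autoencoder(nn.Module):
    """Compresses ROI-level features of one modality into a low-dimensional code."""
    def __init__(self, in_dim, code_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, code_dim), nn.ReLU())
        self.decoder = nn.Linear(code_dim, in_dim)

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)  # code and reconstruction (for a reconstruction loss)


def shared_adjacency(z_mri, z_pet, phenotype_sim, threshold=0.5):
    """Builds one subject-graph adjacency shared by both modalities from
    cosine feature similarity weighted by a phenotypic-similarity matrix."""
    feat = F.normalize(torch.cat([z_mri, z_pet], dim=1), dim=1)
    sim = (feat @ feat.t()) * phenotype_sim        # N x N weighted similarity
    adj = (sim > threshold).float()                # sparsify the graph
    adj.fill_diagonal_(1.0)                        # keep self-loops
    d_inv_sqrt = torch.diag(adj.sum(dim=1).pow(-0.5))
    return d_inv_sqrt @ adj @ d_inv_sqrt           # symmetric normalization


class TwoLayerGCN(nn.Module):
    """Two graph-convolution layers producing class scores for one modality."""
    def __init__(self, in_dim, hidden_dim, n_classes):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hidden_dim)
        self.w2 = nn.Linear(hidden_dim, n_classes)

    def forward(self, x, adj):
        h = F.relu(self.w1(adj @ x))
        return self.w2(adj @ h)


# Toy usage: 32 subjects, 116 ROI features per modality, binary diagnosis.
n_subj, n_roi, code_dim = 32, 116, 16
x_mri, x_pet = torch.randn(n_subj, n_roi), torch.randn(n_subj, n_roi)
ae_mri, ae_pet = Autoencoder(n_roi, code_dim), Autoencoder(n_roi, code_dim)
z_mri, _ = ae_mri(x_mri)
z_pet, _ = ae_pet(x_pet)
pheno_sim = torch.ones(n_subj, n_subj)             # placeholder phenotypic similarity
adj = shared_adjacency(z_mri, z_pet, pheno_sim)
gcn_mri = TwoLayerGCN(code_dim, 32, 2)
gcn_pet = TwoLayerGCN(code_dim, 32, 2)
# Decision-level fusion: average the per-modality class probabilities.
probs = (F.softmax(gcn_mri(z_mri, adj), dim=1) + F.softmax(gcn_pet(z_pet, adj), dim=1)) / 2
predictions = probs.argmax(dim=1)

Sharing one normalized adjacency lets both modality branches propagate information over the same subject graph before their class probabilities are combined at the decision layer, which is the fusion strategy the abstract describes.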
DOI: 10.1109/ACCESS.2024.3435433