ROI-Based Multimodal Neuroimaging Feature Fusion Method and Its Graph Neural Network Diagnostic Model

Bibliographic Details
Published in: IEEE Access, Vol. 13, pp. 26915-26926
Main Authors: Wang, Xuan; Yang, Xiaopeng; Zhang, Xiaotong; Chen, Yang
Format: Journal Article
Language:English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2025
ISSN: 2169-3536
Description
Summary: Single-modality neuroimaging data often provide limited information and are constrained by technical issues such as low signal-to-noise ratio and limited resolution, which can bias analyses and yield an incomplete picture of the brain's complexity. This can hinder the development of diagnostic and therapeutic strategies for brain disorders. To address these challenges, this paper presents the Multimodal Graph Neural Network Model based on Feature Fusion (MMP-DGNN), which leverages sMRI and PET data. The model uses an autoencoder to extract and accurately describe sample features. During feature fusion, a shared adjacency matrix based on feature similarity and phenotypic data is constructed for graph representation. A dual-layer graph neural network then classifies the features, and the per-modality results are fused at the decision layer to produce the final classification. Experimental results show that MMP-DGNN achieves a classification accuracy of 98.17%, outperforming other methods for multimodal neuroimaging data classification.
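To make the pipeline described in the summary concrete, the following is a minimal sketch in PyTorch. The one-layer encoders, layer sizes, cosine-similarity threshold, and single binary phenotypic variable are illustrative assumptions; the paper's actual architecture, hyperparameters, and training procedure are not reproduced in this record.

# Minimal sketch of the MMP-DGNN pipeline described above, assuming PyTorch.
# All dimensions and the similarity threshold are illustrative stand-ins,
# not the authors' published configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AutoEncoder(nn.Module):
    # Per-modality feature extractor; the encoder output feeds the GNN.
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.encoder = nn.Linear(in_dim, hid_dim)
        self.decoder = nn.Linear(hid_dim, in_dim)

    def forward(self, x):
        z = torch.relu(self.encoder(x))
        return z, self.decoder(z)  # latent features and reconstruction

def shared_adjacency(feat, phenotype, thresh=0.2):
    # Edge where feature cosine similarity exceeds the threshold AND the
    # phenotypic variable agrees; symmetrically normalized with self-loops.
    sim = F.cosine_similarity(feat.unsqueeze(1), feat.unsqueeze(0), dim=-1)
    same_pheno = (phenotype.unsqueeze(1) == phenotype.unsqueeze(0)).float()
    adj = (sim > thresh).float() * same_pheno
    adj.fill_diagonal_(1.0)  # self-loops keep every node connected
    d_inv_sqrt = torch.diag(adj.sum(1).pow(-0.5))
    return d_inv_sqrt @ adj @ d_inv_sqrt

class TwoLayerGCN(nn.Module):
    # Dual-layer graph convolution producing per-class scores.
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, n_classes)

    def forward(self, x, adj):
        h = torch.relu(adj @ self.w1(x))
        return adj @ self.w2(h)

# Toy forward pass: 8 subjects, hypothetical 100-d sMRI and 80-d PET vectors.
torch.manual_seed(0)
smri, pet = torch.randn(8, 100), torch.randn(8, 80)
sex = torch.tensor([0, 1, 0, 1, 0, 1, 0, 1])  # phenotypic data

ae_s, ae_p = AutoEncoder(100, 32), AutoEncoder(80, 32)
z_s, _ = ae_s(smri)
z_p, _ = ae_p(pet)

# One adjacency shared by both modalities, built from the fused features.
adj = shared_adjacency(torch.cat([z_s, z_p], dim=1), sex)
gcn_s, gcn_p = TwoLayerGCN(32, 16, 2), TwoLayerGCN(32, 16, 2)

# Decision-level fusion: average the per-modality class probabilities.
probs = 0.5 * (F.softmax(gcn_s(z_s, adj), dim=1) +
               F.softmax(gcn_p(z_p, adj), dim=1))
print(probs.argmax(dim=1))

Averaging the softmax outputs is one common realization of fusion "at the decision layer"; the paper may combine or weight the per-modality results differently.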
DOI: 10.1109/ACCESS.2024.3435433