Hypercomplex Graph Neural Network: Towards Deep Intersection of Multi-Modal Brain Networks.

Authors

Yang Yanwu, Ye Chenfei, Cai Guoqing, Song Kunru, Zhang Jintao, Xiang Yang, Ma Ting

Publication

IEEE J Biomed Health Inform. 2025 May;29(5):3304-3316. doi: 10.1109/JBHI.2024.3490664. Epub 2025 May 6.

Abstract

Multi-modal neuroimaging studies have provided insights into the heteromodal relationships between brain network organization and behavioral phenotypes. Integrating data from various modalities facilitates the characterization of the interplay among anatomical, functional, and physiological brain alterations or developments. Graph Neural Networks (GNNs) have recently become popular for analyzing and fusing multi-modal, graph-structured brain networks. However, effectively learning complementary representations from other modalities remains a significant challenge due to the sophisticated and heterogeneous inter-modal dependencies. Furthermore, most existing studies focus on specific modalities (e.g., only fMRI and DTI), which limits their scalability to other types of brain networks. To overcome these limitations, we propose a HyperComplex Graph Neural Network (HC-GNN) that models multi-modal networks as hypercomplex tensor graphs. In our approach, HC-GNN is conceptualized as a dynamic spatial graph in which the attentively learned inter-modal associations are represented as the adjacency matrix. HC-GNN leverages hypercomplex operations for inter-modal intersection through cross-embedding and cross-aggregation, enriching the deep coupling of multi-modal representations. We conduct a statistical analysis of the saliency maps to identify disease-associated biomarkers. Extensive experiments on three datasets demonstrate the superior classification performance of our method and its strong scalability to various types of modalities. Our work presents a powerful paradigm for the study of multi-modal brain networks.
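
The abstract describes HC-GNN only at a high level. As a reading aid, the sketch below illustrates the two named ideas, hypercomplex cross-embedding across modalities and cross-aggregation over an attentively learned inter-modal adjacency, in a PyTorch-style layer. This is not the authors' implementation; every class name, tensor shape, and weight layout is an assumption made purely for illustration.

```python
# Minimal sketch (assumed, not the authors' code): a hypercomplex-style
# message-passing layer. M modalities are treated as components of a
# hypercomplex number, mixed by a learned Hamilton-like product
# ("cross-embedding"), weighted by an attentively learned inter-modal
# adjacency ("cross-aggregation"), then propagated over the spatial graph.

import torch
import torch.nn as nn
import torch.nn.functional as F


class HypercomplexCrossLayer(nn.Module):
    def __init__(self, num_modalities: int, in_dim: int, out_dim: int):
        super().__init__()
        self.M = num_modalities
        # One projection per (source modality, target modality) pair: the
        # hypercomplex multiplication rule is learned rather than fixed.
        self.cross_weights = nn.Parameter(
            torch.randn(num_modalities, num_modalities, in_dim, out_dim) * 0.02
        )
        # Scores defining the dynamic inter-modal association matrix.
        self.attn = nn.Linear(2 * in_dim, 1)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   (B, M, N, in_dim)  node features per modality
        # adj: (B, N, N)          shared spatial adjacency over brain regions
        B, M, N, D = x.shape

        # Cross-embedding: every target modality receives a projection from
        # every source modality (Hamilton-product-style mixing).
        cross = torch.einsum(
            "bmnd,msdo->bmsno", x, self.cross_weights
        )  # (B, M_src, M_tgt, N, out_dim)

        # Attentively learned inter-modal associations, shape (B, M, M).
        pooled = x.mean(dim=2)  # graph-level summary per modality, (B, M, D)
        pair = torch.cat(
            [pooled.unsqueeze(2).expand(B, M, M, D),
             pooled.unsqueeze(1).expand(B, M, M, D)], dim=-1)
        inter_modal = F.softmax(self.attn(pair).squeeze(-1), dim=-1)

        # Cross-aggregation: weight each source->target message by the
        # inter-modal adjacency, then propagate over the spatial graph.
        mixed = torch.einsum("bms,bmsno->bsno", inter_modal, cross)
        out = torch.einsum("bij,bmjo->bmio", adj, mixed)
        return F.relu(out)
```

In this reading, the learned (M, M, in_dim, out_dim) weight tensor plays the role of a parameterized hypercomplex multiplication rule: rather than fixing quaternion-style sign patterns, the mixing between modality components is learned end to end, which is one plausible way to support an arbitrary number of modality types.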
