
IHGNN: Iterative Interpretable HyperGraph Neural Network for semi-supervised classification.

Authors

Zhang Hongwei, Wang Saizhuo, Hu Zixin, Qi Yuan, Huang Zengfeng, Guo Jian

Affiliations

Fudan University, Shanghai, China.

The Hong Kong University of Science and Technology, Hong Kong Special Administrative Region, China.

Publication

Neural Netw. 2025 Mar;183:106929. doi: 10.1016/j.neunet.2024.106929. Epub 2024 Nov 22.

Abstract

Learning on hypergraphs has garnered significant attention recently due to their ability to effectively represent complex higher-order interactions among multiple entities compared to conventional graphs. Nevertheless, the majority of existing methods are direct extensions of graph neural networks, and they exhibit noteworthy limitations. Specifically, most of these approaches primarily rely on either the Laplacian matrix with information distortion or heuristic message passing techniques. The former tends to escalate algorithmic complexity, while the latter lacks a solid theoretical foundation. To address these limitations, we propose a novel hypergraph neural network named IHGNN, which is grounded in an energy minimization function formulated for hypergraphs. Our analysis reveals that propagation layers align well with the message-passing paradigm in the context of hypergraphs. IHGNN achieves a favorable trade-off between performance and interpretability. Furthermore, it effectively balances the significance of node features and hypergraph topology across a diverse range of datasets. We conducted extensive experiments on 15 datasets, and the results highlight the superior performance of IHGNN in the task of hypergraph node classification across nearly all benchmarking datasets.
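To make the message-passing idea in the abstract concrete, the sketch below shows a generic two-stage hypergraph propagation step (node-to-hyperedge averaging, then hyperedge-to-node averaging) in Python with NumPy. It is an illustration under assumed notation (incidence matrix H, feature matrix X, mixing weight alpha standing in for the feature/topology balance the abstract mentions), not the IHGNN update itself, whose propagation layers are derived from the paper's hypergraph energy-minimization objective.

import numpy as np

def hypergraph_propagate(X, H, alpha=0.5):
    """One illustrative smoothing step on a hypergraph (not the IHGNN layer).

    X: (n_nodes, n_features) node feature matrix.
    H: (n_nodes, n_edges) binary incidence matrix, H[v, e] = 1 if node v belongs to hyperedge e.
    alpha: mixing weight between original and propagated features.
    Assumes every node lies in at least one hyperedge and every hyperedge is non-empty.
    """
    edge_deg = H.sum(axis=0)  # size |e| of each hyperedge
    node_deg = H.sum(axis=1)  # degree d(v) of each node

    # Node -> hyperedge: each hyperedge takes the mean of its member nodes' features.
    E = (H.T @ X) / edge_deg[:, None]

    # Hyperedge -> node: each node takes the mean over its incident hyperedges.
    M = (H @ E) / node_deg[:, None]

    # Residual mixing retains part of the original node features.
    return alpha * X + (1 - alpha) * M

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = np.array([[1, 0],
                  [1, 1],
                  [0, 1],
                  [1, 1]], dtype=float)  # 4 nodes, 2 hyperedges
    X = rng.normal(size=(4, 3))
    print(hypergraph_propagate(X, H).shape)  # (4, 3)

Stacking such propagation steps (with learnable feature transforms between them) is the usual hypergraph message-passing pattern; IHGNN's contribution, per the abstract, is deriving the propagation from an explicit energy-minimization formulation rather than choosing it heuristically.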

Similar articles

HGNN: General Hypergraph Neural Networks.
IEEE Trans Pattern Anal Mach Intell. 2023 Mar;45(3):3181-3199. doi: 10.1109/TPAMI.2022.3182052. Epub 2023 Feb 3.

Mode Hypergraph Neural Network.
IEEE Trans Neural Netw Learn Syst. 2025 Mar 7;PP. doi: 10.1109/TNNLS.2025.3542176.

Hypergraph partitioning using tensor eigenvalue decomposition.
PLoS One. 2023 Jul 21;18(7):e0288457. doi: 10.1371/journal.pone.0288457. eCollection 2023.

Adaptive Neural Message Passing for Inductive Learning on Hypergraphs.
IEEE Trans Pattern Anal Mach Intell. 2025 Jan;47(1):19-31. doi: 10.1109/TPAMI.2024.3434483. Epub 2024 Dec 4.
