Zhang Hongwei, Wang Saizhuo, Hu Zixin, Qi Yuan, Huang Zengfeng, Guo Jian
Fudan University, Shanghai, China.
The Hong Kong University of Science and Technology, Hong Kong SAR, China.
Neural Netw. 2025 Mar;183:106929. doi: 10.1016/j.neunet.2024.106929. Epub 2024 Nov 22.
Learning on hypergraphs has recently garnered significant attention because hypergraphs can represent complex higher-order interactions among multiple entities more effectively than conventional graphs. Nevertheless, most existing methods are direct extensions of graph neural networks and exhibit notable limitations: they rely primarily either on a Laplacian matrix that distorts information or on heuristic message-passing schemes. The former tends to escalate algorithmic complexity, while the latter lacks a solid theoretical foundation. To address these limitations, we propose IHGNN, a novel hypergraph neural network grounded in an energy minimization function formulated for hypergraphs. Our analysis reveals that the resulting propagation layers align well with the message-passing paradigm on hypergraphs. IHGNN achieves a favorable trade-off between performance and interpretability, and it effectively balances the contributions of node features and hypergraph topology across a diverse range of datasets. We conducted extensive experiments on 15 datasets, and the results highlight the superior performance of IHGNN on hypergraph node classification across nearly all benchmark datasets.
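For readers unfamiliar with the propagation the abstract refers to, the sketch below illustrates generic two-stage (node → hyperedge → node) hypergraph message passing of the kind that energy-based hypergraph networks build on. It is a minimal NumPy illustration under stated assumptions, not the authors' IHGNN layer; the function name, the incidence-matrix convention, and the `alpha` blending parameter are introduced here purely for exposition.

```python
import numpy as np

def hypergraph_propagate(X, H, alpha=0.5):
    """One step of degree-normalized two-stage hypergraph propagation (illustrative only).

    X     : (n_nodes, d) node feature matrix
    H     : (n_nodes, n_edges) binary incidence matrix; H[v, e] = 1 if node v belongs to hyperedge e
    alpha : weight on the original features vs. the propagated signal
            (a hypothetical stand-in for the feature/topology balance discussed in the abstract)
    """
    # Node and hyperedge degrees (clipped to avoid division by zero).
    Dv = np.clip(H.sum(axis=1, keepdims=True), 1, None)   # (n_nodes, 1)
    De = np.clip(H.sum(axis=0, keepdims=True), 1, None)   # (1, n_edges)

    # Stage 1: aggregate node features into each hyperedge (mean over member nodes).
    edge_feats = (H.T @ X) / De.T                          # (n_edges, d)

    # Stage 2: scatter hyperedge features back to their member nodes.
    propagated = (H @ edge_feats) / Dv                     # (n_nodes, d)

    # Convex combination of original and propagated features, loosely analogous
    # to one descent step on a smoothness-plus-fidelity energy.
    return alpha * X + (1 - alpha) * propagated


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5, 3))            # 5 nodes with 3-dimensional features
    H = np.array([[1, 0],                  # hyperedge 0 = {0, 1, 2}, hyperedge 1 = {2, 3, 4}
                  [1, 0],
                  [1, 1],
                  [0, 1],
                  [0, 1]], dtype=float)
    print(hypergraph_propagate(X, H).shape)  # (5, 3)
```

This two-stage aggregation is the standard way hypergraph methods generalize neighborhood averaging from graphs; the actual IHGNN propagation rule is derived from the paper's energy minimization objective and is described in the full text.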