

Two-Level Graph Neural Network

Authors

Ai Xing, Sun Chengyu, Zhang Zhihong, Hancock Edwin R

Publication

IEEE Trans Neural Netw Learn Syst. 2024 Apr;35(4):4593-4606. doi: 10.1109/TNNLS.2022.3144343. Epub 2024 Apr 4.

DOI: 10.1109/TNNLS.2022.3144343
PMID: 35167481
Abstract

Graph neural networks (GNNs) are recently proposed neural network structures for processing graph-structured data. Due to their neighbor aggregation strategy, existing GNNs focus on capturing node-level information and neglect high-level information. Existing GNNs therefore suffer from representational limitations caused by the local permutation invariance (LPI) problem. To overcome these limitations and enrich the features captured by GNNs, we propose a novel GNN framework, referred to as the two-level GNN (TL-GNN), which merges subgraph-level information with node-level information. Moreover, we provide a mathematical analysis of the LPI problem, which demonstrates that subgraph-level information is beneficial to overcoming the problems associated with LPI. A subgraph counting method based on a dynamic programming algorithm is also proposed, with time complexity O(n), where n is the number of nodes of a graph. Experiments show that TL-GNN outperforms existing GNNs and achieves state-of-the-art performance.

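The limitation the abstract describes can be seen in a toy example. The sketch below (an illustration only, not the paper's method) shows that mean neighbor aggregation sees only the multiset of neighbor features, so two structurally different neighborhoods can produce identical node-level aggregates; a simple subgraph-level signal, here per-node triangle counts via diag(A^3)/2, used as a generic stand-in for the paper's dynamic-programming subgraph counter, distinguishes them. All names and graphs here are made up for the demo.

```python
import numpy as np

def aggregate_neighbors(x, adj):
    # Node-level step: each node takes the mean of its neighbors' features.
    deg = adj.sum(axis=1, keepdims=True)
    return adj @ x / np.maximum(deg, 1.0)

def triangle_counts(adj):
    # A simple subgraph-level signal: diag(A^3)/2 counts, for each node, the
    # triangles it belongs to (each triangle is closed in two directions).
    return np.diag(np.linalg.matrix_power(adj.astype(int), 3)) / 2

x = np.array([[1.0], [2.0], [3.0], [4.0]])  # one scalar feature per node

# G1: node 0's neighbors {1, 2, 3} are fully connected (the graph is K4).
a1 = np.array([[0, 1, 1, 1],
               [1, 0, 1, 1],
               [1, 1, 0, 1],
               [1, 1, 1, 0]], dtype=float)

# G2: node 0's neighbors {1, 2, 3} share no edges (a star).
a2 = np.array([[0, 1, 1, 1],
               [1, 0, 0, 0],
               [1, 0, 0, 0],
               [1, 0, 0, 0]], dtype=float)

h1 = aggregate_neighbors(x, a1)
h2 = aggregate_neighbors(x, a2)
tri1 = triangle_counts(a1)
tri2 = triangle_counts(a2)

# Node 0 gets the identical aggregate in both graphs: mean aggregation only
# sees the multiset of neighbor features, so the structural difference
# between G1 and G2 is invisible at the node level. The triangle counts
# (3 for node 0 in G1, 0 in G2) do tell the graphs apart; concatenating
# such subgraph-level features with the node-level aggregate mirrors the
# two-level idea at a toy scale.
h1_two_level = np.hstack([h1, tri1[:, None]])
h2_two_level = np.hstack([h2, tri2[:, None]])
```

Any permutation-invariant aggregator (mean, sum, max) has the same blind spot, which is why the abstract frames the limitation as local permutation invariance rather than as a defect of one particular aggregation function.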

Similar Articles

1. Two-Level Graph Neural Network.
IEEE Trans Neural Netw Learn Syst. 2024 Apr;35(4):4593-4606. doi: 10.1109/TNNLS.2022.3144343. Epub 2024 Apr 4.
2. PSA-GNN: An augmented GNN framework with priori subgraph knowledge.
Neural Netw. 2024 May;173:106155. doi: 10.1016/j.neunet.2024.106155. Epub 2024 Feb 4.
3. CI-GNN: A Granger causality-inspired graph neural network for interpretable brain network-based psychiatric diagnosis.
Neural Netw. 2024 Apr;172:106147. doi: 10.1016/j.neunet.2024.106147. Epub 2024 Jan 26.
4. SP-GNN: Learning structure and position information from graphs.
Neural Netw. 2023 Apr;161:505-514. doi: 10.1016/j.neunet.2023.01.051. Epub 2023 Feb 4.
5. Revisiting graph neural networks from hybrid regularized graph signal reconstruction.
Neural Netw. 2023 Jan;157:444-459. doi: 10.1016/j.neunet.2022.11.003. Epub 2022 Nov 12.
6. Co-embedding of edges and nodes with deep graph convolutional neural networks.
Sci Rep. 2023 Oct 8;13(1):16966. doi: 10.1038/s41598-023-44224-1.
7. Motif Graph Neural Network.
IEEE Trans Neural Netw Learn Syst. 2024 Oct;35(10):14833-14847. doi: 10.1109/TNNLS.2023.3281716. Epub 2024 Oct 7.
8. An Integrated Fuzzy Neural Network and Topological Data Analysis for Molecular Graph Representation Learning and Property Forecasting.
Mol Inform. 2025 Mar;44(3):e202400335. doi: 10.1002/minf.202400335.
9. Generalizing Graph Neural Networks on Out-of-Distribution Graphs.
IEEE Trans Pattern Anal Mach Intell. 2024 Jan;46(1):322-337. doi: 10.1109/TPAMI.2023.3321097. Epub 2023 Dec 5.
10. Exploiting Neighbor Effect: Conv-Agnostic GNN Framework for Graphs With Heterophily.
IEEE Trans Neural Netw Learn Syst. 2024 Oct;35(10):13383-13396. doi: 10.1109/TNNLS.2023.3267902. Epub 2024 Oct 7.