

Permute Me Softly: Learning Soft Permutations for Graph Representations.

Authors

Giannis Nikolentzos, George Dasoulas, Michalis Vazirgiannis

Publication

IEEE Trans Pattern Anal Mach Intell. 2023 Apr;45(4):5087-5098. doi: 10.1109/TPAMI.2022.3188911. Epub 2023 Mar 7.

DOI: 10.1109/TPAMI.2022.3188911
PMID: 35793300
Abstract

Graph neural networks (GNNs) have recently emerged as a dominant paradigm for machine learning with graphs. Research on GNNs has mainly focused on the family of message passing neural networks (MPNNs). Similar to the Weisfeiler-Leman (WL) test of isomorphism, these models follow an iterative neighborhood aggregation procedure to update vertex representations, and they next compute graph representations by aggregating the representations of the vertices. Although very successful, MPNNs have been studied intensively in the past few years. Thus, there is a need for novel architectures which will allow research in the field to break away from MPNNs. In this paper, we propose a new graph neural network model, the so-called π-GNN, which learns a "soft" permutation (i.e., doubly stochastic) matrix for each graph, and thus projects all graphs into a common vector space. The learned matrices impose a "soft" ordering on the vertices of the input graphs, and based on this ordering, the adjacency matrices are mapped into vectors. These vectors can be fed into fully-connected or convolutional layers to deal with supervised learning tasks. In case of large graphs, to make the model more efficient in terms of running time and memory, we further relax the doubly stochastic matrices to row stochastic matrices. We empirically evaluate the model on graph classification and graph regression datasets and show that it achieves performance competitive with state-of-the-art models.
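The projection described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: Sinkhorn-style alternating row/column normalization is assumed here as one standard way to obtain an approximately doubly stochastic matrix from learnable scores; the paper's exact parameterization may differ.

```python
import numpy as np

def sinkhorn(logits, n_iters=20):
    """Turn a real-valued score matrix into an (approximately) doubly
    stochastic matrix by alternating row and column normalization."""
    P = np.exp(logits - logits.max())  # positive entries for stability
    for _ in range(n_iters):
        P = P / P.sum(axis=1, keepdims=True)  # rows sum to 1
        P = P / P.sum(axis=0, keepdims=True)  # columns sum to 1
    return P

def soft_permute(adj, logits, n_iters=20):
    """Apply a learned soft permutation to an adjacency matrix and
    flatten the "softly reordered" result into a fixed-size vector,
    which could then be fed to fully-connected or convolutional layers."""
    P = sinkhorn(logits, n_iters)
    return (P.T @ adj @ P).ravel()

# Toy example: a 3-node path graph with random (untrained) scores.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
rng = np.random.default_rng(0)
logits = rng.normal(size=(3, 3))  # in π-GNN these would be learned
vec = soft_permute(adj, logits)   # fixed-size vector, shape (9,)
```

Because every graph is mapped through its own soft permutation into a vector of the same layout, graphs of a given size land in a common vector space, which is what makes a shared downstream classifier possible.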


Similar Articles

1. Permute Me Softly: Learning Soft Permutations for Graph Representations.
   IEEE Trans Pattern Anal Mach Intell. 2023 Apr;45(4):5087-5098. doi: 10.1109/TPAMI.2022.3188911. Epub 2023 Mar 7.
2. Augmented Graph Neural Network with hierarchical global-based residual connections.
   Neural Netw. 2022 Jun;150:149-166. doi: 10.1016/j.neunet.2022.03.008. Epub 2022 Mar 10.
3. k-hop graph neural networks.
   Neural Netw. 2020 Oct;130:195-205. doi: 10.1016/j.neunet.2020.07.008. Epub 2020 Jul 10.
4. Hierarchical Representation Learning in Graph Neural Networks With Node Decimation Pooling.
   IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):2195-2207. doi: 10.1109/TNNLS.2020.3044146. Epub 2022 May 2.
5. Generalization limits of Graph Neural Networks in identity effects learning.
   Neural Netw. 2025 Jan;181:106793. doi: 10.1016/j.neunet.2024.106793. Epub 2024 Oct 10.
6. PSA-GNN: An augmented GNN framework with priori subgraph knowledge.
   Neural Netw. 2024 May;173:106155. doi: 10.1016/j.neunet.2024.106155. Epub 2024 Feb 4.
7. Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting.
   IEEE Trans Pattern Anal Mach Intell. 2023 Jan;45(1):657-668. doi: 10.1109/TPAMI.2022.3154319. Epub 2022 Dec 5.
8. Harnessing collective structure knowledge in data augmentation for graph neural networks.
   Neural Netw. 2024 Dec;180:106651. doi: 10.1016/j.neunet.2024.106651. Epub 2024 Aug 23.
9. Weisfeiler-Lehman goes dynamic: An analysis of the expressive power of Graph Neural Networks for attributed and dynamic graphs.
   Neural Netw. 2024 May;173:106213. doi: 10.1016/j.neunet.2024.106213. Epub 2024 Feb 28.
10. Multi-level attention pooling for graph neural networks: Unifying graph representations with multiple localities.
   Neural Netw. 2022 Jan;145:356-373. doi: 10.1016/j.neunet.2021.11.001. Epub 2021 Nov 10.

Cited By

1. Graph Geometric Algebra networks for graph representation learning.
   Sci Rep. 2025 Jan 2;15(1):170. doi: 10.1038/s41598-024-84483-0.
2. A graph neural network framework for mapping histological topology in oral mucosal tissue.
   BMC Bioinformatics. 2022 Nov 25;23(1):506. doi: 10.1186/s12859-022-05063-5.