

Graph structure reforming framework enhanced by commute time distance for graph classification.

Affiliations

School of Computer Science, Wuhan University, China; Changjiang Schinta Software Technology Co., Ltd., Wuhan, China.

School of Computing and Information Systems, The University of Melbourne, Australia.

Publication

Neural Netw. 2023 Nov;168:539-548. doi: 10.1016/j.neunet.2023.09.044. Epub 2023 Sep 26.

DOI: 10.1016/j.neunet.2023.09.044
PMID: 37837743
Abstract

As a graph data mining task, graph classification has high academic value and wide practical application. Among them, the graph neural network-based method is one of the mainstream methods. Most graph neural networks (GNNs) follow the message passing paradigm and can be called Message Passing Neural Networks (MPNNs), achieving good results in structural data-related tasks. However, it has also been reported that these methods suffer from over-squashing and limited expressive power. In recent years, many works have proposed different solutions to these problems separately, but none has yet considered these shortcomings in a comprehensive way. After considering these several aspects comprehensively, we identify two specific defects: information loss caused by local information aggregation, and an inability to capture higher-order structures. To solve these issues, we propose a plug-and-play framework based on Commute Time Distance (CTD), in which information is propagated in commute time distance neighborhoods. By considering both local and global graph connections, the commute time distance between two nodes is evaluated with reference to the path length and the number of paths in the whole graph. Moreover, the proposed framework CTD-MPNNs (Commute Time Distance-based Message Passing Neural Networks) can capture higher-order structural information by utilizing commute paths to enhance the expressive power of GNNs. Thus, our proposed framework can propagate and aggregate messages from defined important neighbors and model more powerful GNNs. We conduct extensive experiments using various real-world graph classification benchmarks. The experimental performance demonstrates the effectiveness of our framework. Codes are released on https://github.com/Haldate-Yu/CTD-MPNNs.
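The commute time distance the abstract builds on can be computed in closed form from the Moore-Penrose pseudoinverse of the graph Laplacian: CTD(u, v) = vol(G) · (L⁺_uu + L⁺_vv − 2 L⁺_uv), where vol(G) is the sum of node degrees. This reflects both the length and the number of paths between two nodes, as the abstract describes. A minimal sketch of that computation (not the authors' released code; the helper name `commute_time_distances` is hypothetical), assuming an unweighted, undirected graph given as a dense adjacency matrix:

```python
import numpy as np

def commute_time_distances(adj: np.ndarray) -> np.ndarray:
    """Pairwise commute time distances for an undirected graph
    given by its symmetric adjacency matrix."""
    degrees = adj.sum(axis=1)
    laplacian = np.diag(degrees) - adj
    l_pinv = np.linalg.pinv(laplacian)  # Moore-Penrose pseudoinverse of L
    volume = degrees.sum()              # vol(G) = 2 * |E| for unweighted graphs
    diag = np.diag(l_pinv)
    # CTD(u, v) = vol(G) * (L+_uu + L+_vv - 2 * L+_uv)
    return volume * (diag[:, None] + diag[None, :] - 2 * l_pinv)

# Example: the path graph 0 - 1 - 2 (vol = 4).
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]], dtype=float)
ctd = commute_time_distances(adj)
# ctd[0, 1] == 4.0 (adjacent nodes), ctd[0, 2] == 8.0 (two hops)
```

Nodes linked by many short paths get small commute time distances, which is why a CTD neighborhood can pull in structurally important nodes beyond the 1-hop neighbors used by standard message passing.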


Similar Articles

1
Graph structure reforming framework enhanced by commute time distance for graph classification.
Neural Netw. 2023 Nov;168:539-548. doi: 10.1016/j.neunet.2023.09.044. Epub 2023 Sep 26.
2
Augmented Graph Neural Network with hierarchical global-based residual connections.
Neural Netw. 2022 Jun;150:149-166. doi: 10.1016/j.neunet.2022.03.008. Epub 2022 Mar 10.
3
Harnessing collective structure knowledge in data augmentation for graph neural networks.
Neural Netw. 2024 Dec;180:106651. doi: 10.1016/j.neunet.2024.106651. Epub 2024 Aug 23.
4
GTC: GNN-Transformer co-contrastive learning for self-supervised heterogeneous graph representation.
Neural Netw. 2025 Jan;181:106645. doi: 10.1016/j.neunet.2024.106645. Epub 2024 Aug 16.
5
DropAGG: Robust Graph Neural Networks via Drop Aggregation.
Neural Netw. 2023 Jun;163:65-74. doi: 10.1016/j.neunet.2023.03.022. Epub 2023 Mar 29.
6
PSA-GNN: An augmented GNN framework with priori subgraph knowledge.
Neural Netw. 2024 May;173:106155. doi: 10.1016/j.neunet.2024.106155. Epub 2024 Feb 4.
7
DigGCN: Learning Compact Graph Convolutional Networks via Diffusion Aggregation.
IEEE Trans Cybern. 2022 Feb;52(2):912-924. doi: 10.1109/TCYB.2020.2988791. Epub 2022 Feb 16.
8
TREPH: A Plug-In Topological Layer for Graph Neural Networks.
Entropy (Basel). 2023 Feb 10;25(2):331. doi: 10.3390/e25020331.
9
Chain-aware graph neural networks for molecular property prediction.
Bioinformatics. 2024 Oct 1;40(10). doi: 10.1093/bioinformatics/btae574.
10
Co-embedding of edges and nodes with deep graph convolutional neural networks.
Sci Rep. 2023 Oct 8;13(1):16966. doi: 10.1038/s41598-023-44224-1.