
Bag of Tricks for Training Deeper Graph Neural Networks: A Comprehensive Benchmark Study.

Authors

Chen Tianlong, Zhou Kaixiong, Duan Keyu, Zheng Wenqing, Wang Peihao, Hu Xia, Wang Zhangyang

Publication

IEEE Trans Pattern Anal Mach Intell. 2023 Mar;45(3):2769-2781. doi: 10.1109/TPAMI.2022.3174515. Epub 2023 Feb 3.

DOI:10.1109/TPAMI.2022.3174515
PMID:35544513
Abstract

Training deep graph neural networks (GNNs) is notoriously hard. Besides the standard plights in training deep architectures such as vanishing gradients and overfitting, it also uniquely suffers from over-smoothing, information squashing, and so on, which limits their potential power for encoding the high-order neighbor structure in large-scale graphs. Although numerous efforts are proposed to address these limitations, such as various forms of skip connections, graph normalization, and random dropping, it is difficult to disentangle the advantages brought by a deep GNN architecture from those "tricks" necessary to train such an architecture. Moreover, the lack of a standardized benchmark with fair and consistent experimental settings poses an almost insurmountable obstacle to gauge the effectiveness of new mechanisms. In view of those, we present the first fair and reproducible benchmark dedicated to assessing the "tricks" of training deep GNNs. We categorize existing approaches, investigate their hyperparameter sensitivity, and unify the basic configuration. Comprehensive evaluations are then conducted on tens of representative graph datasets including the recent large-scale Open Graph Benchmark, with diverse deep GNN backbones. We demonstrate that an organic combo of initial connection, identity mapping, group and batch normalization attains the new state-of-the-art results for deep GNNs on large datasets. Codes are available: https://github.com/VITA-Group/Deep_GCN_Benchmarking.
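Two of the tricks the abstract singles out, initial connection and identity mapping, follow the GCNII-style layer update. The sketch below is a minimal pure-Python illustration of that update on a hypothetical 3-node toy graph; all matrices and hyperparameter values here are made-up examples, not the benchmark's actual configuration, and no deep-learning framework is used.

```python
import math

def matmul(A, B):
    """Naive dense matrix product over lists of lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def gcnii_layer(H, H0, P, W, alpha, beta):
    """One layer: H' = ReLU(((1-alpha)*P*H + alpha*H0) * ((1-beta)*I + beta*W))."""
    prop = matmul(P, H)  # graph propagation with normalized adjacency P
    # Initial connection: mix the propagated features with the layer-0 features H0.
    mixed = [[(1 - alpha) * p + alpha * h0 for p, h0 in zip(pr, hr)]
             for pr, hr in zip(prop, H0)]
    # Identity mapping: keep the learned transform close to the identity matrix.
    n = len(W)
    T = [[(1 - beta) * (1.0 if i == j else 0.0) + beta * W[i][j]
          for j in range(n)] for i in range(n)]
    return [[max(0.0, v) for v in row] for row in matmul(mixed, T)]  # ReLU

# Toy path graph 0-1-2 with self-loops, symmetrically normalized adjacency.
s6 = math.sqrt(6.0)
P = [[0.5, 1 / s6, 0.0],
     [1 / s6, 1 / 3.0, 1 / s6],
     [0.0, 1 / s6, 0.5]]
H0 = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]]  # 2-d input features per node
W = [[0.1, 0.0], [0.0, 0.1]]               # small illustrative weight matrix

H = H0
for _ in range(8):  # stacking many layers stays stable thanks to the two tricks
    H = gcnii_layer(H, H0, P, W, alpha=0.1, beta=0.5)
```

Without the `alpha * H0` term and the near-identity transform, repeated multiplication by `P` drives node features toward indistinguishable values (the over-smoothing the abstract describes); the two tricks anchor every layer to the input features.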


Similar articles

1. Bag of Tricks for Training Deeper Graph Neural Networks: A Comprehensive Benchmark Study.
   IEEE Trans Pattern Anal Mach Intell. 2023 Mar;45(3):2769-2781. doi: 10.1109/TPAMI.2022.3174515. Epub 2023 Feb 3.
2. Scalable deeper graph neural networks for high-performance materials property prediction.
   Patterns (N Y). 2022 Apr 27;3(5):100491. doi: 10.1016/j.patter.2022.100491. eCollection 2022 May 13.
3. Augmented Graph Neural Network with hierarchical global-based residual connections.
   Neural Netw. 2022 Jun;150:149-166. doi: 10.1016/j.neunet.2022.03.008. Epub 2022 Mar 10.
4. Multiphysical graph neural network (MP-GNN) for COVID-19 drug design.
   Brief Bioinform. 2022 Jul 18;23(4). doi: 10.1093/bib/bbac231.
5. Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural Networks.
   IEEE Trans Neural Netw Learn Syst. 2024 Oct;35(10):14903-14917. doi: 10.1109/TNNLS.2023.3282049. Epub 2024 Oct 7.
6. Exploiting Neighbor Effect: Conv-Agnostic GNN Framework for Graphs With Heterophily.
   IEEE Trans Neural Netw Learn Syst. 2024 Oct;35(10):13383-13396. doi: 10.1109/TNNLS.2023.3267902. Epub 2024 Oct 7.
7. Auto-GNN: Neural architecture search of graph neural networks.
   Front Big Data. 2022 Nov 17;5:1029307. doi: 10.3389/fdata.2022.1029307. eCollection 2022.
8. Cancer drug response prediction with surrogate modeling-based graph neural architecture search.
   Bioinformatics. 2023 Aug 1;39(8). doi: 10.1093/bioinformatics/btad478.
9. Automatic Design of Deep Graph Neural Networks With Decoupled Mode.
   IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):7918-7930. doi: 10.1109/TNNLS.2024.3438609. Epub 2025 May 2.
10. Reinforced GNNs for Multiple Instance Learning.
   IEEE Trans Neural Netw Learn Syst. 2025 Apr;36(4):6693-6707. doi: 10.1109/TNNLS.2024.3392575. Epub 2025 Apr 4.

Cited by

1. ADSTGCN: A Dynamic Adaptive Deeper Spatio-Temporal Graph Convolutional Network for Multi-Step Traffic Forecasting.
   Sensors (Basel). 2023 Aug 4;23(15):6950. doi: 10.3390/s23156950.