

Attribute-driven streaming edge partitioning with reconciliations for distributed graph neural network training.

Affiliations

Zhejiang University, Hangzhou, China.

Publication

Neural Netw. 2023 Aug;165:987-998. doi: 10.1016/j.neunet.2023.06.026. Epub 2023 Jun 28.

DOI: 10.1016/j.neunet.2023.06.026
PMID: 37467586
Abstract

Current distributed graph training frameworks evenly partition a large graph into small chunks to suit distributed storage, leverage a uniform interface to access neighbors, and train graph neural networks in a cluster of machines to update weights. Nevertheless, they consider a separate design of storage and training, resulting in huge communication costs for retrieving neighborhoods. During the storage phase, traditional heuristic graph partitioning not only suffers from memory overhead because of loading the full graph into the memory but also damages semantically related structures because of its neglecting meaningful node attributes. What is more, in the weight-update phase, directly averaging synchronization is difficult to tackle with heterogeneous local models where each machine's data are loaded from different subgraphs, resulting in slow convergence. To solve these problems, we propose a novel distributed graph training approach, attribute-driven streaming edge partitioning with reconciliations (ASEPR), where the local model loads only the subgraph stored on its own machine to make fewer communications. ASEPR firstly clusters nodes with similar attributes in the same partition to maintain semantic structure and keep multihop neighbor locality. Then streaming partitioning combined with attribute clustering is applied to subgraph assignment to alleviate memory overhead. After local graph neural network training on distributed machines, we deploy cross-layer reconciliation strategies for heterogeneous local models to improve the averaged global model by knowledge distillation and contrastive learning. Extensive experiments conducted on four large graph datasets on node classification and link prediction tasks show that our model outperforms DistDGL, with fewer resource requirements and up to quadruple the convergence speed.
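The abstract's core storage-phase idea — assign each edge in a single streaming pass to the partition whose accumulated node attributes it most resembles, while keeping partition sizes balanced — can be sketched as follows. This is an illustrative sketch only: the function names, the cosine-similarity-minus-load scoring rule, and the `alpha` balance weight are assumptions for exposition, not the paper's exact ASEPR procedure.

```python
# Hedged sketch of attribute-driven streaming edge partitioning.
# Assumed scoring rule: attribute similarity to a partition's running
# centroid, penalized by that partition's current load fraction.
import math

def cosine(a, b):
    """Cosine similarity; returns 0.0 when either vector is all-zero."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def stream_partition(edges, attrs, k, alpha=1.0):
    """Greedy single-pass edge partitioning.

    edges: list of (u, v) node-id pairs, each seen once (streaming)
    attrs: dict mapping node id -> attribute vector
    k:     number of partitions
    alpha: weight of the load-balance penalty
    Returns (per-edge partition ids, per-partition edge counts).
    """
    dim = len(next(iter(attrs.values())))
    centroids = [[0.0] * dim for _ in range(k)]  # running attribute sums
    loads = [0] * k                              # edges assigned so far
    parts, total = [], 0
    for u, v in edges:
        # Edge attribute = sum of its endpoints' attribute vectors.
        edge_attr = [x + y for x, y in zip(attrs[u], attrs[v])]
        best, best_score = 0, -math.inf
        for p in range(k):
            # Prefer partitions whose accumulated attributes resemble this
            # edge (keeps semantically related structure together) ...
            sim = cosine(edge_attr, centroids[p])
            # ... but penalize already-loaded partitions to stay balanced.
            score = sim - alpha * loads[p] / (total + 1)
            if score > best_score:
                best, best_score = p, score
        parts.append(best)
        loads[best] += 1
        total += 1
        centroids[best] = [c + x for c, x in zip(centroids[best], edge_attr)]
    return parts, loads
```

With two attribute clusters (e.g. nodes 0,1 with attribute `[1,0]` and nodes 2,3 with `[0,1]`), edges inside each cluster end up co-located in the same partition while loads stay even — the locality-plus-balance trade-off the abstract describes, without ever holding the full graph in memory.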


Similar articles

1. Attribute-driven streaming edge partitioning with reconciliations for distributed graph neural network training.
   Neural Netw. 2023 Aug;165:987-998. doi: 10.1016/j.neunet.2023.06.026. Epub 2023 Jun 28.
2. Local structure-aware graph contrastive representation learning.
   Neural Netw. 2024 Apr;172:106083. doi: 10.1016/j.neunet.2023.12.037. Epub 2023 Dec 27.
3. Distributed Optimization of Graph Convolutional Network Using Subgraph Variance.
   IEEE Trans Neural Netw Learn Syst. 2024 Aug;35(8):10764-10775. doi: 10.1109/TNNLS.2023.3243904. Epub 2024 Aug 5.
4. SAMCL: Subgraph-Aligned Multiview Contrastive Learning for Graph Anomaly Detection.
   IEEE Trans Neural Netw Learn Syst. 2025 Jan;36(1):1664-1676. doi: 10.1109/TNNLS.2023.3323274. Epub 2025 Jan 7.
5. Contrastive learning of graphs under label noise.
   Neural Netw. 2024 Apr;172:106113. doi: 10.1016/j.neunet.2024.106113. Epub 2024 Jan 6.
6. PSA-GNN: An augmented GNN framework with priori subgraph knowledge.
   Neural Netw. 2024 May;173:106155. doi: 10.1016/j.neunet.2024.106155. Epub 2024 Feb 4.
7. Graph Representation Learning Based on Specific Subgraphs for Biomedical Interaction Prediction.
   IEEE/ACM Trans Comput Biol Bioinform. 2024 Sep-Oct;21(5):1552-1564. doi: 10.1109/TCBB.2024.3402741. Epub 2024 Oct 9.
8. Subgraph-Aware Graph Kernel Neural Network for Link Prediction in Biological Networks.
   IEEE J Biomed Health Inform. 2024 Jul;28(7):4373-4381. doi: 10.1109/JBHI.2024.3390092. Epub 2024 Jul 2.
9. Multi-type neighbors enhanced global topology and pairwise attribute learning for drug-protein interaction prediction.
   Brief Bioinform. 2022 Sep 20;23(5). doi: 10.1093/bib/bbac120.
10. Augmented Graph Neural Network with hierarchical global-based residual connections.
    Neural Netw. 2022 Jun;150:149-166. doi: 10.1016/j.neunet.2022.03.008. Epub 2022 Mar 10.