

Adversarial Caching Training: Unsupervised Inductive Network Representation Learning on Large-Scale Graphs.

Authors

Chen Junyang, Gong Zhiguo, Wang Wei, Wang Cong, Xu Zhenghua, Lv Jianming, Li Xueliang, Wu Kaishun, Liu Weiwen

Publication

IEEE Trans Neural Netw Learn Syst. 2022 Dec;33(12):7079-7090. doi: 10.1109/TNNLS.2021.3084195. Epub 2022 Nov 30.

DOI: 10.1109/TNNLS.2021.3084195
PMID: 34111002
Abstract

Network representation learning (NRL) has far-reaching effects on data mining research, showing its importance in many real-world applications. NRL, also known as network embedding, aims at preserving graph structures in a low-dimensional space. These learned representations can be used for subsequent machine learning tasks, such as vertex classification, link prediction, and data visualization. Recently, graph convolutional network (GCN)-based models, e.g., GraphSAGE, have drawn a lot of attention for their success in inductive NRL. When conducting unsupervised learning on large-scale graphs, some of these models employ negative sampling (NS) for optimization, which encourages a target vertex to be close to its neighbors while being far from its negative samples. However, NS draws negative vertices through a random pattern or based on the degrees of vertices. Thus, the generated samples could be either highly relevant or completely unrelated to the target vertex. Moreover, as the training goes, the gradient of NS objective calculated with the inner product of the unrelated negative samples and the target vertex may become zero, which will lead to learning inferior representations. To address these problems, we propose an adversarial training method tailored for unsupervised inductive NRL on large networks. For efficiently keeping track of high-quality negative samples, we design a caching scheme with sampling and updating strategies that has a wide exploration of vertex proximity while considering training costs. Besides, the proposed method is adaptive to various existing GCN-based models without significantly complicating their optimization process. Extensive experiments show that our proposed method can achieve better performance compared with the state-of-the-art models.
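The failure mode the abstract describes — the gradient of the NS objective, computed from the inner product of an unrelated negative sample and the target vertex, shrinking to zero — can be illustrated with a small sketch. This is a hedged illustration of the standard negative-sampling term (as used in unsupervised GraphSAGE-style objectives), not the paper's implementation; the embeddings and the `neg_sample_grad` helper are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_sample_grad(z_u, z_n):
    """Gradient w.r.t. z_u of the NS term -log(sigmoid(-z_u . z_n)).

    With s = z_u . z_n, d/dz_u [-log(sigmoid(-s))] = sigmoid(s) * z_n,
    so the gradient vanishes as s becomes strongly negative.
    """
    s = z_u @ z_n
    return sigmoid(s) * z_n

rng = np.random.default_rng(0)
z_u = rng.normal(size=16)                    # target vertex embedding
hard_neg = z_u + 0.1 * rng.normal(size=16)   # relevant negative: large inner product
easy_neg = -z_u                              # unrelated negative: strongly negative inner product

g_hard = np.linalg.norm(neg_sample_grad(z_u, hard_neg))
g_easy = np.linalg.norm(neg_sample_grad(z_u, easy_neg))
# The unrelated negative contributes a near-zero gradient, so randomly drawn
# negatives stop informing the update -- the motivation for caching hard negatives.
print(g_hard, g_easy)
```

This is why the paper's caching scheme tracks high-quality (hard) negatives: once a randomly drawn negative is far from the target in embedding space, its gradient contribution is effectively zero and it no longer improves the representation.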

Similar Articles

1. Adversarial Caching Training: Unsupervised Inductive Network Representation Learning on Large-Scale Graphs.
   IEEE Trans Neural Netw Learn Syst. 2022 Dec;33(12):7079-7090. doi: 10.1109/TNNLS.2021.3084195. Epub 2022 Nov 30.
2. Self-Training Enhanced: Network Embedding and Overlapping Community Detection With Adversarial Learning.
   IEEE Trans Neural Netw Learn Syst. 2022 Nov;33(11):6737-6748. doi: 10.1109/TNNLS.2021.3083318. Epub 2022 Oct 27.
3. WalkGAN: Network Representation Learning With Sequence-Based Generative Adversarial Networks.
   IEEE Trans Neural Netw Learn Syst. 2024 Apr;35(4):5684-5694. doi: 10.1109/TNNLS.2022.3208914. Epub 2024 Apr 4.
4. CRL: Collaborative Representation Learning by Coordinating Topic Modeling and Network Embeddings.
   IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3765-3777. doi: 10.1109/TNNLS.2021.3054422. Epub 2022 Aug 3.
5. Learning Aligned Vertex Convolutional Networks for Graph Classification.
   IEEE Trans Neural Netw Learn Syst. 2024 Apr;35(4):4423-4437. doi: 10.1109/TNNLS.2021.3129649. Epub 2024 Apr 4.
6. Improving Network Representation Learning via Dynamic Random Walk, Self-Attention and Vertex Attributes-Driven Laplacian Space Optimization.
   Entropy (Basel). 2022 Aug 30;24(9):1213. doi: 10.3390/e24091213.
7. Temporal network embedding framework with causal anonymous walks representations.
   PeerJ Comput Sci. 2022 Jan 20;8:e858. doi: 10.7717/peerj-cs.858. eCollection 2022.
8. MAMF-GCN: Multi-scale adaptive multi-channel fusion deep graph convolutional network for predicting mental disorder.
   Comput Biol Med. 2022 Sep;148:105823. doi: 10.1016/j.compbiomed.2022.105823. Epub 2022 Jul 6.
9. Proximity-Based Compression for Network Embedding.
   Front Big Data. 2021 Jan 26;3:608043. doi: 10.3389/fdata.2020.608043. eCollection 2020.
10. Signed random walk diffusion for effective representation learning in signed graphs.
   PLoS One. 2022 Mar 17;17(3):e0265001. doi: 10.1371/journal.pone.0265001. eCollection 2022.

Cited By

1. A Knowledge-Guided Graph Learning Approach Bridging Phenotype- and Target-Based Drug Discovery.
   Adv Sci (Weinh). 2025 Apr;12(16):e2412402. doi: 10.1002/advs.202412402. Epub 2025 Mar 6.
2. An overview of video recommender systems: state-of-the-art and research issues.
   Front Big Data. 2023 Oct 30;6:1281614. doi: 10.3389/fdata.2023.1281614. eCollection 2023.
3. Histopathologic brain age estimation via multiple instance learning.
   Acta Neuropathol. 2023 Dec;146(6):785-802. doi: 10.1007/s00401-023-02636-3. Epub 2023 Oct 10.