

Self-supervised contrastive graph representation with node and graph augmentation.

Affiliation

School of Software, Yunnan University, Kunming 650500, China.

Publication

Neural Netw. 2023 Oct;167:223-232. doi: 10.1016/j.neunet.2023.08.039. Epub 2023 Aug 24.

DOI: 10.1016/j.neunet.2023.08.039
PMID: 37660671
Abstract

Graph representation is a critical technology in knowledge engineering and knowledge-based applications, since most knowledge bases are represented as graphs. Contrastive learning has become a prominent approach to graph representation: it contrasts positive-positive and positive-negative node pairs between two augmentation graphs, and it has achieved state-of-the-art results in self-supervised graph representation. However, existing contrastive graph representation methods generate the augmentation graph mainly by modifying the original graph structure (normally by removing some edges or nodes). This inevitably changes the original structure, so the generated augmentation graph is no longer equivalent to the original graph, which harms representation quality on structure-sensitive graphs such as protein, chemical, and molecular graphs. Moreover, in self-supervised graph contrastive learning each node has only one positive-positive pair but relatively many positive-negative pairs, so samples from the same class, or very similar samples, can be treated as negatives. To this end, we propose a Virtual Masking Augmentation (VMA) that generates an augmentation graph without changing any structure of the original graph. Meanwhile, a node augmentation method is proposed to augment the positive node pairs by discovering the most similar nodes in the same graph. Two different augmentation graphs are then generated and fed into a contrastive learning model to learn the graph representation. Extensive experiments on many datasets demonstrate that our method achieves new state-of-the-art results on self-supervised graph representation. The source code of the proposed method is available at https://github.com/DuanhaoranCC/CGRA.
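The two ideas the abstract leans on, a masking-style augmentation that leaves the graph structure untouched, and an InfoNCE-style contrast between corresponding nodes in two augmented views, can be sketched in a few lines. This is a minimal, self-contained illustration under stated assumptions, not the authors' CGRA implementation: `mask_features`, `info_nce`, and the toy feature matrix are all hypothetical, and real implementations would operate on GNN embeddings rather than raw features.

```python
import math
import random

def mask_features(x, p, rng):
    """Masking-style augmentation: randomly zero feature dimensions.
    Edges are never touched, so the graph structure stays identical."""
    return [[0.0 if rng.random() < p else v for v in row] for row in x]

def cosine(u, v):
    """Cosine similarity with a small epsilon to guard all-zero rows."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1e-12
    nv = math.sqrt(sum(b * b for b in v)) or 1e-12
    return dot / (nu * nv)

def info_nce(view1, view2, tau=0.5):
    """Average InfoNCE loss over nodes: node i in view1 is contrasted
    against all nodes in view2, with node i in view2 as the positive."""
    n = len(view1)
    loss = 0.0
    for i in range(n):
        sims = [math.exp(cosine(view1[i], view2[j]) / tau) for j in range(n)]
        loss += -math.log(sims[i] / sum(sims))
    return loss / n

rng = random.Random(0)
# Toy node-feature matrix: 4 nodes, 3 features (hypothetical values).
x = [[1.0, 0.0, 2.0],
     [0.0, 1.0, 1.0],
     [2.0, 2.0, 0.0],
     [1.0, 1.0, 1.0]]
v1 = mask_features(x, 0.3, rng)  # first augmented view
v2 = mask_features(x, 0.3, rng)  # second augmented view
loss = info_nce(v1, v2)
```

The paper's node augmentation would additionally treat each node's most similar in-graph neighbour as an extra positive, softening the "one positive, many negatives" imbalance the abstract describes.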


Similar articles

1
Self-supervised contrastive graph representation with node and graph augmentation.
Neural Netw. 2023 Oct;167:223-232. doi: 10.1016/j.neunet.2023.08.039. Epub 2023 Aug 24.
2
Attention-wise masked graph contrastive learning for predicting molecular property.
Brief Bioinform. 2022 Sep 20;23(5). doi: 10.1093/bib/bbac303.
3
Community-CL: An Enhanced Community Detection Algorithm Based on Contrastive Learning.
Entropy (Basel). 2023 May 29;25(6):864. doi: 10.3390/e25060864.
4
Hierarchically Contrastive Hard Sample Mining for Graph Self-Supervised Pretraining.
IEEE Trans Neural Netw Learn Syst. 2024 Nov;35(11):16748-16761. doi: 10.1109/TNNLS.2023.3297607. Epub 2024 Oct 29.
5
TCGL: Temporal Contrastive Graph for Self-Supervised Video Representation Learning.
IEEE Trans Image Process. 2022;31:1978-1993. doi: 10.1109/TIP.2022.3147032. Epub 2022 Feb 18.
6
Local structure-aware graph contrastive representation learning.
Neural Netw. 2024 Apr;172:106083. doi: 10.1016/j.neunet.2023.12.037. Epub 2023 Dec 27.
7
Prototypical Graph Contrastive Learning.
IEEE Trans Neural Netw Learn Syst. 2024 Feb;35(2):2747-2758. doi: 10.1109/TNNLS.2022.3191086. Epub 2024 Feb 5.
8
Accurate graph classification via two-staged contrastive curriculum learning.
PLoS One. 2024 Jan 3;19(1):e0296171. doi: 10.1371/journal.pone.0296171. eCollection 2024.
9
BiMGCL: rumor detection bi-directional multi-level graph contrastive learning.
PeerJ Comput Sci. 2023 Nov 10;9:e1659. doi: 10.7717/peerj-cs.1659. eCollection 2023.
10
Self-Supervised Node Representation Learning via Node-to-Neighbourhood Alignment.
IEEE Trans Pattern Anal Mach Intell. 2024 Jun;46(6):4218-4233. doi: 10.1109/TPAMI.2024.3358541. Epub 2024 May 7.