

Effective Temporal Graph Learning via Personalized PageRank

Authors

Liao Ziyu, Liu Tao, He Yue, Lin Longlong

Affiliations

College of Computer and Information Science, Southwest University, Chongqing 400715, China.

Publication

Entropy (Basel). 2024 Jul 10;26(7):588. doi: 10.3390/e26070588.

DOI: 10.3390/e26070588
PMID: 39056950
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11275858/
Abstract

Graph representation learning aims to map nodes or edges within a graph to low-dimensional vectors, while preserving as much topological information as possible. Over the past decades, numerous algorithms for graph representation learning have emerged. Among them, proximity matrix representation methods have been shown to exhibit excellent performance in experiments and to scale to large graphs with millions of nodes. However, with the rapid development of the Internet, information interactions are happening at the scale of billions every moment. Most methods for similarity matrix factorization still focus on static graphs, leading to incomplete similarity descriptions and low embedding quality. To enhance the embedding quality of temporal graph learning, we propose a temporal graph representation learning model based on the matrix factorization of Time-constrained Personalized PageRank (TPPR) matrices. TPPR, an extension of personalized PageRank (PPR) that incorporates temporal information, better captures node similarities in temporal graphs. Based on this, we use Singular Value Decomposition or Nonnegative Matrix Factorization to decompose TPPR matrices to obtain embedding vectors for each node. Through experiments on tasks such as link prediction, node classification, and node clustering across multiple temporal graphs, as well as a comparison with various experimental methods, we find that graph representation learning algorithms based on TPPR matrix factorization achieve overall outstanding scores on multiple temporal datasets, highlighting their effectiveness.
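The pipeline the abstract describes (build a time-aware PPR proximity matrix, then factorize it to get node embeddings) can be sketched roughly as follows. This is a minimal illustration, not the paper's method: the exp(t - t_max) recency weighting and the log transform are assumed stand-ins for the actual TPPR construction, and dense power iteration replaces whatever scalable PPR computation the authors use.

```python
import numpy as np

def ppr_matrix(A, alpha=0.15, n_iter=100):
    """Dense personalized PageRank by power iteration.
    Row i of the result is the PPR vector personalized on node i."""
    n = A.shape[0]
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0          # avoid division by zero for isolated nodes
    P = A / deg                  # row-stochastic transition matrix
    Pi = np.eye(n)
    for _ in range(n_iter):
        # Fixed point of: Pi = alpha * I + (1 - alpha) * Pi @ P
        Pi = alpha * np.eye(n) + (1 - alpha) * Pi @ P
    return Pi

def temporal_ppr_embed(edges, n_nodes, dim=2, alpha=0.15):
    """Embed a temporal graph by factorizing a time-weighted PPR matrix.
    `edges` is a list of (u, v, t) interactions."""
    A = np.zeros((n_nodes, n_nodes))
    t_max = max(t for _, _, t in edges)
    for u, v, t in edges:
        w = np.exp(t - t_max)    # newer interactions get weight closer to 1
        A[u, v] += w
        A[v, u] += w             # treat interactions as undirected
    Pi = ppr_matrix(A, alpha=alpha)
    # Log-scale the proximity matrix (common in PPR-factorization methods),
    # then truncate an SVD to obtain dim-dimensional node embeddings.
    M = np.log(np.maximum(Pi * n_nodes, 1.0))
    U, S, _ = np.linalg.svd(M)
    return U[:, :dim] * np.sqrt(S[:dim])

edges = [(0, 1, 1.0), (1, 2, 2.0), (2, 3, 3.0), (0, 2, 4.0)]
Z = temporal_ppr_embed(edges, n_nodes=4, dim=2)
print(Z.shape)  # (4, 2)
```

The resulting rows of Z can then be fed to downstream link-prediction, classification, or clustering models, which is how the paper evaluates embedding quality.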


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8346/11275858/83ce8fbdb0b3/entropy-26-00588-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8346/11275858/01af880b77bc/entropy-26-00588-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8346/11275858/8c6656f1e0d2/entropy-26-00588-g003a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8346/11275858/530b71d8ae9e/entropy-26-00588-g004a.jpg

Similar Articles

1. Effective Temporal Graph Learning via Personalized PageRank.
   Entropy (Basel). 2024 Jul 10;26(7):588. doi: 10.3390/e26070588.
2. Survey on graph embeddings and their applications to machine learning problems on graphs.
   PeerJ Comput Sci. 2021 Feb 4;7:e357. doi: 10.7717/peerj-cs.357. eCollection 2021.
3. Graph Representation Learning and Its Applications: A Survey.
   Sensors (Basel). 2023 Apr 21;23(8):4168. doi: 10.3390/s23084168.
4. TigeCMN: On exploration of temporal interaction graph embedding via Coupled Memory Neural Networks.
   Neural Netw. 2021 Aug;140:13-26. doi: 10.1016/j.neunet.2021.02.016. Epub 2021 Mar 4.
5. Embedding graphs on Grassmann manifold.
   Neural Netw. 2022 Aug;152:322-331. doi: 10.1016/j.neunet.2022.05.001. Epub 2022 May 10.
6. ConTIG: Continuous representation learning on temporal interaction graphs.
   Neural Netw. 2024 Apr;172:106151. doi: 10.1016/j.neunet.2024.106151. Epub 2024 Jan 29.
7. DynG2G: An Efficient Stochastic Graph Embedding Method for Temporal Graphs.
   IEEE Trans Neural Netw Learn Syst. 2022 Jun 10;PP. doi: 10.1109/TNNLS.2022.3178706.
8. Proximity-Based Compression for Network Embedding.
   Front Big Data. 2021 Jan 26;3:608043. doi: 10.3389/fdata.2020.608043. eCollection 2020.
9. Graph Autoencoder with Preserving Node Attribute Similarity.
   Entropy (Basel). 2023 Mar 26;25(4):567. doi: 10.3390/e25040567.
10. A Novel Representation Learning for Dynamic Graphs Based on Graph Convolutional Networks.
    IEEE Trans Cybern. 2023 Jun;53(6):3599-3612. doi: 10.1109/TCYB.2022.3159661. Epub 2023 May 17.
