

Temporal dynamics unleashed: Elevating variational graph attention.

Authors

Molaei Soheila, Niknam Ghazaleh, Ghosheh Ghadeer O, Chauhan Vinod Kumar, Zare Hadi, Zhu Tingting, Pan Shirui, Clifton David A

Affiliations

Department of Engineering Science, University of Oxford, United Kingdom.

Department of Data Science and Technology, University of Tehran, Iran.

Publication

Knowl Based Syst. 2024 Sep 5;299:112110. doi: 10.1016/j.knosys.2024.112110.

DOI: 10.1016/j.knosys.2024.112110
PMID: 39474470
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11513757/
Abstract

This research introduces the Variational Graph Attention Dynamics (VarGATDyn), addressing the complexities of dynamic graph representation learning, where existing models, tailored for static graphs, prove inadequate. VarGATDyn melds attention mechanisms with a Markovian assumption to surpass the challenges of maintaining temporal consistency and the extensive dataset requirements typical of RNN-based frameworks. It harnesses the strengths of the Variational Graph Auto-Encoder (VGAE) framework, Graph Attention Networks (GAT), and Gaussian Mixture Models (GMM) to adeptly navigate the temporal and structural intricacies of dynamic graphs. Through the strategic application of GMMs, the model handles multimodal patterns, thereby rectifying misalignments between prior and estimated posterior distributions. An innovative multiple-learning methodology bolsters the model's adaptability, leading to an encompassing and effective learning process. Empirical tests underscore VarGATDyn's dominance in dynamic link prediction across various datasets, highlighting its proficiency in capturing multimodal distributions and temporal dynamics.
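The encoder the abstract describes combines GAT-style attention with a variational objective. As a rough illustration of the attention step alone (a minimal single-head sketch in NumPy, not the authors' implementation; all names and shapes here are hypothetical), the attention coefficients over a graph snapshot can be computed as follows:

```python
import numpy as np

def gat_attention(h, W, a, adj):
    """Single-head GAT-style attention layer.
    h: (N, F) node features, W: (F, F') projection weights,
    a: (2*F',) attention vector, adj: (N, N) 0/1 adjacency with self-loops."""
    z = h @ W                                   # (N, F') projected features
    N = z.shape[0]
    # e_ij = LeakyReLU(a^T [z_i || z_j]) for every node pair
    e = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            e[i, j] = np.concatenate([z[i], z[j]]) @ a
    e = np.where(e > 0, e, 0.2 * e)             # LeakyReLU, slope 0.2
    # mask out non-edges, then softmax row-wise over each node's neighbours
    e = np.where(adj > 0, e, -1e9)
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ z                            # (N, F') attended features
```

Mapping the outputs of such layers to the mean and variance of a latent distribution recovers a VGAE-style encoder; VarGATDyn additionally ties consecutive graph snapshots together through the Markovian assumption and replaces the unimodal prior with a Gaussian mixture, both of which this sketch omits.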


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f07f/11513757/f33bb708900e/gr1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f07f/11513757/dd4806371330/gr2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f07f/11513757/66229a535664/gr3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f07f/11513757/a7a5eb0f19b5/gr4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f07f/11513757/37232fbf28ea/gr5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f07f/11513757/45879f1a75c5/gr6.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f07f/11513757/9b97fde978fe/gr7.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f07f/11513757/cff679f0d984/gr8.jpg

Similar Articles

1. Temporal dynamics unleashed: Elevating variational graph attention.
Knowl Based Syst. 2024 Sep 5;299:112110. doi: 10.1016/j.knosys.2024.112110.
2. DyVGRNN: DYnamic mixture Variational Graph Recurrent Neural Networks.
Neural Netw. 2023 Aug;165:596-610. doi: 10.1016/j.neunet.2023.05.048. Epub 2023 Jun 5.
3. Graph-based prediction of Protein-protein interactions with attributed signed graph embedding.
BMC Bioinformatics. 2020 Jul 21;21(1):323. doi: 10.1186/s12859-020-03646-8.
4. Variational graph auto-encoders for miRNA-disease association prediction.
Methods. 2021 Aug;192:25-34. doi: 10.1016/j.ymeth.2020.08.004. Epub 2020 Aug 13.
5. Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatiotemporal Forecasting.
IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):9524-9537. doi: 10.1109/TNNLS.2024.3415149. Epub 2025 May 2.
6. TransformerG2G: Adaptive time-stepping for learning temporal graph embeddings using transformers.
Neural Netw. 2024 Apr;172:106086. doi: 10.1016/j.neunet.2023.12.040. Epub 2023 Dec 26.
7. Representation Learning for Dynamic Functional Connectivities via Variational Dynamic Graph Latent Variable Models.
Entropy (Basel). 2022 Jan 19;24(2):152. doi: 10.3390/e24020152.
8. Optimizing Variational Graph Autoencoder for Community Detection with Dual Optimization.
Entropy (Basel). 2020 Feb 7;22(2):197. doi: 10.3390/e22020197.
9. Genome-scale enzymatic reaction prediction by variational graph autoencoders.
bioRxiv. 2023 Mar 12:2023.03.08.531729. doi: 10.1101/2023.03.08.531729.
10. Dynamic network link prediction with node representation learning from graph convolutional networks.
Sci Rep. 2024 Jan 4;14(1):538. doi: 10.1038/s41598-023-50977-6.
