

CRL: Collaborative Representation Learning by Coordinating Topic Modeling and Network Embeddings.

Author Information

Chen Junyang, Gong Zhiguo, Wang Wei, Liu Weiwen, Dong Xiao

Publication Information

IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3765-3777. doi: 10.1109/TNNLS.2021.3054422. Epub 2022 Aug 3.

DOI: 10.1109/TNNLS.2021.3054422
PMID: 33566768
Abstract

Network representation learning (NRL) has shown its effectiveness in many tasks, such as vertex classification, link prediction, and community detection. In many applications, vertices of social networks contain textual information, e.g., citation networks; this information forms a text corpus that typical representation learning methods can exploit. The global context in the text corpus can be utilized by topic models to discover the topic structures of vertices. Nevertheless, most existing NRL approaches focus on learning representations from the local neighbors of vertices and ignore the global structure of the associated textual information in networks. In this article, we propose a unified model based on matrix factorization (MF), named collaborative representation learning (CRL), which: 1) considers complementary global and local information simultaneously and 2) models topics and learns network embeddings collaboratively. Moreover, we incorporate Fletcher-Reeves (FR) MF, a conjugate gradient method, to optimize the embedding matrices in an alternating manner. We refer to this parameter learning method as AFR; it achieves convergence after a few iterations. By evaluating CRL on topic coherence and vertex classification using several real-world data sets, our experimental study shows that this collaborative model not only improves topic discovery over the baseline topic models but also learns better network representations than state-of-the-art context-aware NRL models.
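The AFR scheme described in the abstract applies Fletcher-Reeves (FR) conjugate gradient updates to the factor matrices of an MF objective in alternation. The following is an illustrative sketch only, not the authors' implementation: the toy least-squares objective, Armijo backtracking line search, matrix shapes, and step-size constants are all assumptions added for the example.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, n_iter=200, tol=1e-8):
    """Nonlinear Fletcher-Reeves conjugate gradient with Armijo backtracking.

    In an alternating-MF setting (as AFR alternates between factor
    matrices), x0 would be one factor matrix with the other held fixed.
    """
    x = x0.copy()
    g = grad(x)
    d = -g                                   # start with steepest descent
    for _ in range(n_iter):
        gd = np.sum(g * d)
        if gd >= 0:                          # safeguard: reset if not a
            d = -g                           # descent direction
            gd = np.sum(g * d)
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * gd:
            alpha *= 0.5                     # Armijo backtracking
        x = x + alpha * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            break
        # Fletcher-Reeves coefficient: ||g_new||^2 / ||g||^2
        beta = np.sum(g_new * g_new) / np.sum(g * g)
        d = -g_new + beta * d                # conjugate direction update
        g = g_new
    return x

# Toy alternating-MF step: with W fixed, fit H to minimize ||X - W H||_F^2
rng = np.random.default_rng(0)
X = rng.random((6, 4))
W = rng.random((6, 2))
H0 = rng.random((2, 4))
f = lambda H: np.linalg.norm(X - W @ H) ** 2
grad = lambda H: 2 * W.T @ (W @ H - X)
H = fletcher_reeves_cg(f, grad, H0)
```

A full CRL-style optimizer would alternate such updates between the topic and embedding matrices until both converge; the sketch shows only the single-factor FR-CG step.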


Similar Articles

1. CRL: Collaborative Representation Learning by Coordinating Topic Modeling and Network Embeddings.
IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3765-3777. doi: 10.1109/TNNLS.2021.3054422. Epub 2022 Aug 3.
2. TACN: A Topical Adversarial Capsule Network for textual network embedding.
Neural Netw. 2021 Dec;144:766-777. doi: 10.1016/j.neunet.2021.09.026. Epub 2021 Oct 6.
3. Self-Training Enhanced: Network Embedding and Overlapping Community Detection With Adversarial Learning.
IEEE Trans Neural Netw Learn Syst. 2022 Nov;33(11):6737-6748. doi: 10.1109/TNNLS.2021.3083318. Epub 2022 Oct 27.
4. Adversarial Caching Training: Unsupervised Inductive Network Representation Learning on Large-Scale Graphs.
IEEE Trans Neural Netw Learn Syst. 2022 Dec;33(12):7079-7090. doi: 10.1109/TNNLS.2021.3084195. Epub 2022 Nov 30.
5. Fusion of text and graph information for machine learning problems on networks.
PeerJ Comput Sci. 2021 May 11;7:e526. doi: 10.7717/peerj-cs.526. eCollection 2021.
6. WalkGAN: Network Representation Learning With Sequence-Based Generative Adversarial Networks.
IEEE Trans Neural Netw Learn Syst. 2024 Apr;35(4):5684-5694. doi: 10.1109/TNNLS.2022.3208914. Epub 2024 Apr 4.
7. Multi-Task Network Representation Learning.
Front Neurosci. 2020 Jan 23;14:1. doi: 10.3389/fnins.2020.00001. eCollection 2020.
8. Hypergraph Collaborative Network on Vertices and Hyperedges.
IEEE Trans Pattern Anal Mach Intell. 2023 Mar;45(3):3245-3258. doi: 10.1109/TPAMI.2022.3178156. Epub 2023 Feb 3.
9. Improving Network Representation Learning via Dynamic Random Walk, Self-Attention and Vertex Attributes-Driven Laplacian Space Optimization.
Entropy (Basel). 2022 Aug 30;24(9):1213. doi: 10.3390/e24091213.
10. Context Attention Heterogeneous Network Embedding.
Comput Intell Neurosci. 2019 Aug 21;2019:8106073. doi: 10.1155/2019/8106073. eCollection 2019.

Cited By

1. Dual-branch collaborative learning network for crop disease identification.
Front Plant Sci. 2023 Feb 10;14:1117478. doi: 10.3389/fpls.2023.1117478. eCollection 2023.