Xie Yu, Jin Peixuan, Gong Maoguo, Zhang Chen, Yu Bin
School of Computer Science and Technology, Xidian University, Xi'an, China.
Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Electronic Engineering, Xidian University, Xi'an, China.
Front Neurosci. 2020 Jan 23;14:1. doi: 10.3389/fnins.2020.00001. eCollection 2020.
Networks, such as social networks, biochemical networks, and protein-protein interaction networks, are ubiquitous in the real world. Network representation learning aims to embed the nodes of a network as low-dimensional, dense, real-valued vectors that facilitate downstream network analysis. Existing embedding methods commonly endeavor to capture structural information in a network but give little consideration to subsequent tasks and the synergies between them, which are equally important for learning desirable network representations. To address this issue, we propose a novel multi-task network representation learning (MTNRL) framework that is end-to-end and more effective for the underlying tasks. The original network and an incomplete network share a unified embedding layer, followed by node classification and link prediction tasks performed simultaneously on the embedding vectors. By optimizing the multi-task loss function, our framework jointly learns task-oriented embedding representations for each node. Moreover, the framework is compatible with all network embedding methods, and experimental results on several benchmark datasets demonstrate its effectiveness compared with state-of-the-art methods.
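The multi-task objective described above can be sketched as follows. This is a minimal, hypothetical illustration of the general idea, not the authors' exact formulation: shared node embeddings feed both a node-classification head (softmax cross-entropy) and a link-prediction head (binary cross-entropy on inner-product edge scores), and the two losses are combined with an assumed weighting hyperparameter `alpha`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy shared embedding layer and task heads; all shapes and data are synthetic.
n_nodes, dim, n_classes = 6, 4, 3
Z = rng.normal(size=(n_nodes, dim))       # shared node embedding matrix
W = rng.normal(size=(dim, n_classes))     # classification head weights
labels = np.array([0, 1, 2, 0, 1, 2])     # node-classification targets
edges = [(0, 1), (2, 3), (4, 5)]          # observed links (positives)
non_edges = [(0, 4), (1, 5), (2, 4)]      # sampled non-links (negatives)

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def classification_loss(Z, W, labels):
    # Cross-entropy over softmax(Z @ W) for the node-classification task.
    p = softmax(Z @ W)
    return -np.log(p[np.arange(len(labels)), labels]).mean()

def link_loss(Z, edges, non_edges):
    # Binary cross-entropy on sigmoid(z_i . z_j) for the link-prediction task.
    pos = np.array([sigmoid(Z[i] @ Z[j]) for i, j in edges])
    neg = np.array([sigmoid(Z[i] @ Z[j]) for i, j in non_edges])
    return -(np.log(pos).mean() + np.log1p(-neg).mean()) / 2

alpha = 0.5  # task-weighting hyperparameter (assumed, not from the paper)
total = (alpha * classification_loss(Z, W, labels)
         + (1 - alpha) * link_loss(Z, edges, non_edges))
print(float(total))
```

In the full framework, gradients of this joint loss would flow back into the shared embedding layer, so both tasks shape the learned representations.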