Shi Min, Qu Bo, Li Xiang, Li Cong
Adaptive Networks and Control Lab, Department of Electronic Engineering, School of Information Science and Technology, Fudan University, Shanghai, China.
Peng Cheng Laboratory, Shenzhen, China.
Front Physiol. 2022 May 27;13:910873. doi: 10.3389/fphys.2022.910873. eCollection 2022.
Previous network representation learning methods have mainly focused on exploring the microscopic structure, i.e., the pairwise relationship or similarity between nodes. However, the mesoscopic structure, i.e., the community structure, an essential property of real networks, has not been thoroughly studied in network representation learning. We here propose a deep attributed network representation learning with community awareness (DANRL-CA) framework. Specifically, we design a neighborhood enhancement autoencoder module to capture the 2-step relations between node pairs. To explore multi-step relations, we construct a community-aware skip-gram module based on the encoder. We introduce two variants of DANRL-CA, namely DANRL-CA-AM and DANRL-CA-CSM, which incorporate community information and attribute semantics into node neighborhoods in different ways. We compare the two variant models with state-of-the-art methods on four datasets for node classification and link prediction; in particular, we apply our models to a brain network. Their superior performance indicates the scalability and effectiveness of our method on various networks. Compared with DANRL-CA-AM, DANRL-CA-CSM can more flexibly coordinate the roles of node attributes and community information during network representation learning, and shows superiority on networks with sparse topological structure and node attributes.
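To make the two components named in the abstract concrete, the following is a minimal sketch (not the authors' code) of how a neighborhood enhancement autoencoder and a skip-gram objective over multi-step contexts could be combined under a shared encoder. The use of PyTorch, the layer sizes, the loss weighting alpha, and the way community information enters (here only implicitly, through which context pairs are sampled) are all illustrative assumptions; the abstract does not specify these details.

```python
# Illustrative sketch of an autoencoder + skip-gram joint objective over a
# shared encoder. Hyperparameters, architecture, and sampling scheme are
# assumptions, not the published DANRL-CA implementation.
import torch
import torch.nn as nn

class DANRLSketch(nn.Module):
    def __init__(self, in_dim, emb_dim, num_nodes):
        super().__init__()
        # Encoder/decoder pair playing the role of the neighborhood
        # enhancement autoencoder over node attribute/neighborhood features.
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, emb_dim))
        self.decoder = nn.Sequential(nn.Linear(emb_dim, 256), nn.ReLU(),
                                     nn.Linear(256, in_dim))
        # Context embeddings for the skip-gram term (one vector per node).
        self.context = nn.Embedding(num_nodes, emb_dim)

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)

def loss_fn(model, x, center_idx, context_idx, neg_idx, alpha=1.0):
    """Reconstruction loss + negative-sampled skip-gram loss.

    x           : neighborhood-enhanced feature matrix (num_nodes, in_dim)
    center_idx  : indices of walk/center nodes
    context_idx : indices of their context nodes (e.g., community-aware)
    neg_idx     : negative samples, shape (num_pairs, k)
    """
    z, x_hat = model(x)
    recon = ((x_hat - x) ** 2).mean()                  # autoencoder term
    zc = z[center_idx]                                 # (num_pairs, emb_dim)
    pos = model.context(context_idx)                   # positive contexts
    neg = model.context(neg_idx)                       # (num_pairs, k, emb_dim)
    pos_score = torch.sigmoid((zc * pos).sum(-1)).clamp_min(1e-7).log()
    neg_score = torch.sigmoid(-(neg * zc.unsqueeze(1)).sum(-1)).clamp_min(1e-7).log()
    skipgram = -(pos_score + neg_score.sum(-1)).mean() # skip-gram term
    return recon + alpha * skipgram

# Toy usage with random data (4 nodes, 16-dimensional features).
if __name__ == "__main__":
    torch.manual_seed(0)
    model = DANRLSketch(in_dim=16, emb_dim=8, num_nodes=4)
    x = torch.rand(4, 16)
    loss = loss_fn(model, x,
                   center_idx=torch.tensor([0, 1]),
                   context_idx=torch.tensor([1, 2]),
                   neg_idx=torch.tensor([[2, 3], [0, 3]]))
    loss.backward()  # gradients flow through both objectives via the encoder
    print(float(loss))
```

Because both terms share the encoder output z, the embeddings are shaped jointly by 2-step attribute/neighborhood reconstruction and by the multi-step, community-aware context pairs fed to the skip-gram loss.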