Chen Jie, Chen Shouzhen, Bai Mingyuan, Pu Jian, Zhang Junping, Gao Junbin
IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):9859-9873. doi: 10.1109/TNNLS.2022.3161453. Epub 2023 Nov 30.
Graph neural networks (GNNs) have become ubiquitous in graph node classification tasks. Most GNN methods update node embeddings iteratively by aggregating information from each node's neighbors. However, they often suffer from negative disturbance caused by edges that connect nodes with different labels. One way to alleviate this negative disturbance is to use attention to learn the aggregation weights, but current attention-based GNNs consider only feature similarity and lack supervision. In this article, we consider the label dependency of graph nodes and propose a decoupling attention mechanism that learns both hard and soft attention. The hard attention is learned on labels to produce a refined graph structure with fewer interclass edges, so that the negative disturbance in aggregation is reduced. The soft attention learns the aggregation weights from features over the refined graph structure to increase the information gain during message passing. In particular, we formulate our model under the expectation-maximization (EM) framework, where the learned attention guides label propagation in the M-step and feature propagation in the E-step. Extensive experiments on six well-known benchmark graph datasets verify the effectiveness of the proposed method.
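The following is a minimal sketch (not the authors' implementation) of how the decoupled hard/soft attention described above could look in PyTorch. The class name DecoupledAttentionLayer, the inner-product label-agreement score, and the hard_threshold cutoff are all illustrative assumptions; the outer EM loop that alternates label propagation (M-step) and feature propagation (E-step) is omitted.

```python
import torch
import torch.nn as nn


class DecoupledAttentionLayer(nn.Module):
    """Sketch of one message-passing layer with decoupled attention.

    Hard attention prunes edges whose endpoint label distributions
    disagree (fewer interclass edges); soft attention reweights the
    surviving edges by feature similarity. All details are assumptions
    made for illustration, not the paper's exact formulation.
    """

    def __init__(self, in_dim, out_dim, hard_threshold=0.5):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim)
        self.att = nn.Linear(2 * out_dim, 1)
        self.hard_threshold = hard_threshold  # assumed pruning cutoff

    def forward(self, x, edge_index, label_probs):
        # x: (N, in_dim) node features; edge_index: (2, E) COO edges;
        # label_probs: (N, C) current label beliefs, e.g. from the E-step.
        src, dst = edge_index
        h = self.proj(x)

        # Hard attention: keep an edge only if the endpoints' label
        # distributions are similar enough (inner-product agreement).
        agree = (label_probs[src] * label_probs[dst]).sum(dim=-1)
        keep = agree >= self.hard_threshold
        src, dst = src[keep], dst[keep]

        # Soft attention: feature-based scores on the refined graph.
        logits = self.att(torch.cat([h[src], h[dst]], dim=-1)).squeeze(-1)
        alpha = torch.exp(logits - logits.max())  # shifted for stability

        # Normalize per destination node and aggregate weighted messages.
        denom = torch.zeros(x.size(0), device=x.device).index_add_(0, dst, alpha)
        weights = (alpha / denom[dst].clamp_min(1e-12)).unsqueeze(-1)
        out = torch.zeros_like(h).index_add_(0, dst, weights * h[src])
        return out
```

In a full pipeline, label_probs would be re-estimated from the aggregated features after each pass and fed back into the hard-attention step, which is the role the EM alternation plays in the abstract.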