School of Computer Science and Engineering, Beihang University, Beijing, 100191, PR China.
Neural Netw. 2024 Nov;179:106522. doi: 10.1016/j.neunet.2024.106522. Epub 2024 Jul 9.
Graph Neural Networks (GNNs) have achieved remarkable progress in graph representation learning. Their most prominent characteristic, propagating features along edges, degrades their performance on most heterophilic graphs. Some studies attempt to construct KNN graphs to improve graph homophily; however, there is no prior knowledge for choosing a proper K, and these methods may suffer from the problem of Inconsistent Similarity Distribution (ISD). To address this issue, we propose Probability Graph Complementation Contrastive Learning (PGCCL), which adaptively constructs a complementation graph. We employ a Beta Mixture Model (BMM) to distinguish intra-class similarity from inter-class similarity. Based on the posterior probability, we construct Probability Complementation Graphs to form contrastive views. Contrastive learning prompts the model to preserve complementary information for each node across the different views. By combining the original graph embedding with the complementation graph embedding, the final embedding captures rich semantics in the fine-tuning stage. Finally, comprehensive experimental results on 20 datasets, including both homophilic and heterophilic graphs, firmly verify the effectiveness of our algorithm and the quality of the probability complementation graph compared with other state-of-the-art methods.
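The BMM step described above can be sketched as follows: fit a two-component Beta mixture to pairwise node similarities, interpret the high-mean component as intra-class and the low-mean one as inter-class, and keep candidate complementation edges whose posterior probability of being intra-class is high. This is a minimal illustration, not the paper's exact procedure; the `fit_bmm` function, the moment-matching EM updates, and the 0.9 posterior threshold are all assumptions made for the example.

```python
import numpy as np
from math import lgamma

def beta_logpdf(x, a, b):
    # log density of Beta(a, b) at x
    log_B = lgamma(a) + lgamma(b) - lgamma(a + b)
    return (a - 1) * np.log(x) + (b - 1) * np.log(1 - x) - log_B

def fit_bmm(s, n_iter=50):
    """Fit a 2-component Beta mixture to similarities s in (0, 1) via EM,
    using method-of-moments M-steps (a common approximation)."""
    s = np.clip(s, 1e-4, 1 - 1e-4)
    # initialize responsibilities by splitting at the median, so one
    # component covers low (inter-class) and one high (intra-class) values
    hi = (s > np.median(s)).astype(float)
    resp = np.stack([1 - hi, hi], axis=1)
    for _ in range(n_iter):
        weights = resp.mean(axis=0)
        params = []
        for k in range(2):
            # M-step: moment-match a Beta to each component's weighted data
            w = resp[:, k] / resp[:, k].sum()
            m = (w * s).sum()
            v = (w * (s - m) ** 2).sum()
            common = m * (1 - m) / max(v, 1e-6) - 1
            params.append((max(m * common, 1e-2), max((1 - m) * common, 1e-2)))
        # E-step: posterior responsibility of each component for each sample
        log_lik = np.stack(
            [np.log(weights[k]) + beta_logpdf(s, *params[k]) for k in range(2)],
            axis=1,
        )
        log_lik -= log_lik.max(axis=1, keepdims=True)
        resp = np.exp(log_lik)
        resp /= resp.sum(axis=1, keepdims=True)
    return params, resp

# toy similarities: a low (inter-class) and a high (intra-class) mode
rng = np.random.default_rng(0)
sims = np.concatenate([rng.beta(2, 8, 300), rng.beta(8, 2, 300)])
params, resp = fit_bmm(sims)

# the component with the larger mean a / (a + b) is the intra-class one
intra = int(np.argmax([a / (a + b) for a, b in params]))
# keep candidate edges whose posterior P(intra-class | s) is high
keep = resp[:, intra] > 0.9
```

In the full method, `keep` would gate which node pairs enter the probability complementation graph; the threshold here stands in for whatever posterior-based rule the paper actually uses.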