
From Cluster Assumption to Graph Convolution: Graph-Based Semi-Supervised Learning Revisited.

Author Information

Wang Zheng, Ding Hongming, Pan Li, Li Jianhua, Gong Zhiguo, Yu Philip S

Publication Information

IEEE Trans Neural Netw Learn Syst. 2024 Oct 7;PP. doi: 10.1109/TNNLS.2024.3454710.

Abstract

Graph-based semi-supervised learning (GSSL) has long been a research focus. Traditional methods are generally shallow learners, based on the cluster assumption. Recently, graph convolutional networks (GCNs) have become the predominant technique owing to their promising performance. However, a critical question remains largely unanswered: why do deep GCNs encounter the oversmoothing problem, while traditional shallow GSSL methods do not, despite both progressing through the graph in a similar iterative manner? In this article, we theoretically discuss the relationship between these two types of methods in a unified optimization framework. One of the most intriguing findings is that, unlike traditional ones, typical GCNs may not effectively incorporate both graph structure and label information at each layer. Motivated by this, we propose three simple but powerful graph convolution methods. The first, optimized simple graph convolution (OGC), is a supervised method that guides the graph convolution process with labels. The others are two "no-learning" unsupervised methods: graph structure preserving graph convolution (GGC) and its multiscale version GGCM, both aiming to preserve the graph structure information during the convolution process. Finally, we conduct extensive experiments to show the effectiveness of our methods.

