Chen Zhaoliang, Wu Zhihao, Lin Zhenghong, Wang Shiping, Plant Claudia, Guo Wenzhong
IEEE Trans Neural Netw Learn Syst. 2024 Oct;35(10):13764-13776. doi: 10.1109/TNNLS.2023.3271623. Epub 2024 Oct 7.
The graph convolutional network (GCN), with its powerful capacity to model graph-structured data, has achieved notable success in recent years. Nonetheless, most existing GCN-based models suffer from the notorious over-smoothing issue, which is why shallow architectures are widely adopted. This can be problematic for complex graph datasets, where a deeper GCN should help propagate information across remote neighbors. Recent works have attempted to address over-smoothing, for example by introducing residual connections or fusing predictions from multilayer models. Because embeddings from deep layers become indistinguishable, it is preferable to generate more reliable per-layer predictions before combining the outputs of different layers. In light of this, we propose an alternating graph-regularized neural network (AGNN) composed of graph convolutional layers (GCLs) and graph embedding layers (GELs). The GEL is derived from a graph-regularized optimization problem containing a Laplacian embedding term, which alleviates over-smoothing by periodically projecting features from the low-order space onto the high-order space. With more distinguishable features across layers, an improved AdaBoost strategy is used to aggregate the outputs of each layer, exploiting integrated embeddings of multi-hop neighbors. The proposed model is evaluated in extensive experiments, including comparisons with several multilayer and multi-order graph neural networks, which demonstrate that AGNN outperforms state-of-the-art models.
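The alternating structure described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the GCL is the standard GCN propagation ReLU(ÂHW), and the GEL is taken here as the closed-form minimizer of a graph-regularized objective ‖Z − H‖²_F + λ·tr(ZᵀLZ), i.e., Z = (I + λL)⁻¹H, which is one common form such a Laplacian-embedding term can take. The per-layer outputs are collected so they could later be combined by an AdaBoost-style weighting (not shown); the toy graph, λ value, and weight initialization are all illustrative assumptions.

```python
import numpy as np

def normalized_adjacency(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcl(H, A_norm, W):
    # Graph convolutional layer: ReLU(A_norm @ H @ W)
    return np.maximum(A_norm @ H @ W, 0.0)

def gel(H, L, lam=0.5):
    # Graph embedding layer (illustrative form): closed-form minimizer of
    #   ||Z - H||_F^2 + lam * tr(Z^T L Z)   =>   Z = (I + lam * L)^{-1} H
    n = H.shape[0]
    return np.linalg.solve(np.eye(n) + lam * L, H)

# Toy 4-node path graph (purely for illustration)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A      # unnormalized graph Laplacian
A_norm = normalized_adjacency(A)

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))                     # node features
W1 = rng.standard_normal((3, 3))
W2 = rng.standard_normal((3, 3))

# Alternate GCL and GEL; keep each layer's output for later aggregation
H1 = gcl(X, A_norm, W1)
H2 = gel(H1, L)
H3 = gcl(H2, A_norm, W2)
outputs = [H1, H2, H3]   # candidates for AdaBoost-style weighted combination
```

With λ = 0 (or a zero Laplacian) the GEL reduces to the identity, so the regularization strength directly controls how strongly embeddings are pulled back toward a graph-smooth but still layer-distinguishable representation.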