
Another Perspective of Over-Smoothing: Alleviating Semantic Over-Smoothing in Deep GNNs.

Authors

Li Jin, Zhang Qirong, Liu Wenxi, Chan Antoni B, Fu Yang-Geng

Publication Information

IEEE Trans Neural Netw Learn Syst. 2025 Apr;36(4):6897-6910. doi: 10.1109/TNNLS.2024.3402317. Epub 2025 Apr 4.

DOI: 10.1109/TNNLS.2024.3402317
PMID: 38809736
Abstract

Graph neural networks (GNNs) are widely used for analyzing graph-structured data and solving graph-related tasks due to their powerful expressiveness. However, existing off-the-shelf GNN-based models usually consist of no more than three layers. Deeper GNNs usually suffer from severe performance degradation due to several issues, including the infamous "over-smoothing" issue, which restricts the further development of GNNs. In this article, we investigate the over-smoothing issue in deep GNNs. We discover that over-smoothing not only results in indistinguishable embeddings of graph nodes, but also alters and even corrupts their semantic structures, an effect dubbed semantic over-smoothing. Existing techniques, e.g., graph normalization, aim at handling the former concern but neglect the importance of preserving the semantic structures in the spatial domain, which hinders further improvement of model performance. To alleviate this concern, we propose a cluster-keeping sparse aggregation strategy to preserve the semantic structure of embeddings in deep GNNs (especially spatial GNNs). In particular, our strategy heuristically redistributes the extent of aggregation for all nodes across layers, instead of aggregating them equally, so that deep layers aggregate concise yet meaningful information. Without any bells and whistles, it can be easily implemented as a plug-and-play structure for GNNs via weighted residual connections. Finally, we analyze the over-smoothing issue in GNNs with weighted residual structures and conduct experiments demonstrating performance comparable to the state of the art.
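The abstract describes the strategy as a plug-and-play structure built on weighted residual connections. Below is a minimal sketch of what such a layer stack could look like, assuming a PyTorch Geometric GCNConv backbone; the learnable per-layer mixing weights (alphas) are an illustrative placeholder and do not reproduce the paper's cluster-keeping heuristic for redistributing aggregation across layers.

```python
# Minimal sketch of a deep GNN with per-layer weighted residual connections,
# assuming PyTorch Geometric is available. The mixing weights below are an
# illustrative placeholder, not the paper's cluster-keeping sparse
# aggregation heuristic.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch_geometric.nn import GCNConv


class WeightedResidualGNN(nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim, num_layers=16):
        super().__init__()
        self.input_proj = nn.Linear(in_dim, hidden_dim)
        self.convs = nn.ModuleList(
            [GCNConv(hidden_dim, hidden_dim) for _ in range(num_layers)]
        )
        # One mixing weight per layer: how much freshly aggregated neighborhood
        # information is blended in; the residual path keeps the rest.
        # Initialized so sigmoid(alpha) is small, keeping the residual dominant
        # in a deep stack (an assumption for this sketch).
        self.alphas = nn.Parameter(torch.full((num_layers,), -2.0))
        self.output_proj = nn.Linear(hidden_dim, out_dim)

    def forward(self, x, edge_index):
        h = self.input_proj(x)
        for conv, alpha in zip(self.convs, self.alphas):
            a = torch.sigmoid(alpha)  # constrain the mixing weight to (0, 1)
            h = (1.0 - a) * h + a * F.relu(conv(h, edge_index))
        return self.output_proj(h)
```

Keeping the residual term dominant limits how much each node re-aggregates from its neighborhood at deep layers, which matches the intuition of letting deep layers take in concise yet meaningful information rather than aggregating equally everywhere.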

Similar Articles

1. Another Perspective of Over-Smoothing: Alleviating Semantic Over-Smoothing in Deep GNNs.
   IEEE Trans Neural Netw Learn Syst. 2025 Apr;36(4):6897-6910. doi: 10.1109/TNNLS.2024.3402317. Epub 2025 Apr 4.
2. A universal strategy for smoothing deceleration in deep graph neural networks.
   Neural Netw. 2025 May;185:107132. doi: 10.1016/j.neunet.2025.107132. Epub 2025 Jan 13.
3. DWSSA: Alleviating over-smoothness for deep Graph Neural Networks.
   Neural Netw. 2024 Jun;174:106228. doi: 10.1016/j.neunet.2024.106228. Epub 2024 Mar 6.
4. DropAGG: Robust Graph Neural Networks via Drop Aggregation.
   Neural Netw. 2023 Jun;163:65-74. doi: 10.1016/j.neunet.2023.03.022. Epub 2023 Mar 29.
5. Augmented Graph Neural Network with hierarchical global-based residual connections.
   Neural Netw. 2022 Jun;150:149-166. doi: 10.1016/j.neunet.2022.03.008. Epub 2022 Mar 10.
6. GTC: GNN-Transformer co-contrastive learning for self-supervised heterogeneous graph representation.
   Neural Netw. 2025 Jan;181:106645. doi: 10.1016/j.neunet.2024.106645. Epub 2024 Aug 16.
7. Co-embedding of edges and nodes with deep graph convolutional neural networks.
   Sci Rep. 2023 Oct 8;13(1):16966. doi: 10.1038/s41598-023-44224-1.
8. Shared Growth of Graph Neural Networks via Prompted Free-Direction Knowledge Distillation.
   IEEE Trans Pattern Anal Mach Intell. 2025 Jun;47(6):4377-4394. doi: 10.1109/TPAMI.2025.3543211. Epub 2025 May 7.
9. AGNN: Alternating Graph-Regularized Neural Networks to Alleviate Over-Smoothing.
   IEEE Trans Neural Netw Learn Syst. 2024 Oct;35(10):13764-13776. doi: 10.1109/TNNLS.2023.3271623. Epub 2024 Oct 7.
10. Automatic Design of Deep Graph Neural Networks With Decoupled Mode.
    IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):7918-7930. doi: 10.1109/TNNLS.2024.3438609. Epub 2025 May 2.