DREAM: A Dual Variational Framework for Unsupervised Graph Domain Adaptation.

Authors

Yin Nan, Shen Li, Wang Mengzhu, Liu Xinwang, Chen Chong, Hua Xian-Sheng

Publication

IEEE Trans Pattern Anal Mach Intell. 2025 Nov;47(11):10787-10800. doi: 10.1109/TPAMI.2025.3596054.

DOI: 10.1109/TPAMI.2025.3596054
PMID: 40763055
Abstract

Graph classification has been a prominent problem in graph machine learning fields. This problem has been investigated by leveraging message passing neural networks (MPNNs) to learn powerful graph representations. However, MPNNs extract topological semantics implicitly under label supervision, which could suffer from domain shift and label scarcity in unsupervised domain adaptation settings. In this paper, we propose an effective solution named Dual Variational Semantics Graph Mining (DREAM) for unsupervised graph domain adaptation by combining graph structural semantics from complementary perspectives. Besides a message passing branch to learn implicit semantics, our DREAM trains a path aggregation branch, which can provide explicit high-order structural semantics as a supplement. To train these two branches conjointly, we employ an expectation-maximization (EM) style variational framework for the maximization of likelihood. In the E-step, we fix the message passing branch and construct a graph-of-graph to indicate the geometric correlation between source and target domains, which would be adopted for the optimization of the other branch. In the M-step, we train the message passing branch and update the graph neural networks on the graph-of-graph with the other branch fixed. The alternative optimization improves the collaboration of knowledge from two branches. Extensive experiments on several benchmark datasets validate the superiority of the proposed DREAM compared with various baselines.
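The alternating E-step/M-step scheme described above (freeze one branch, update the other, then swap) can be illustrated with a deliberately simplified sketch. This is hypothetical toy code, not the authors' implementation: the two scalar parameters stand in for the message passing branch and the path aggregation branch, and a shared quadratic objective stands in for the variational likelihood.

```python
# Toy sketch of EM-style alternating optimization between two branches.
# All names and the objective are illustrative assumptions, not DREAM's code.

def joint_loss(theta_mp: float, theta_pa: float, target: float = 1.0) -> float:
    """Stand-in for the shared variational objective the two branches optimize."""
    return (target - theta_mp * theta_pa) ** 2

def train_em_style(steps: int = 100, lr: float = 0.05) -> tuple[float, float]:
    theta_mp = 0.5  # parameter of the message passing branch (toy scalar)
    theta_pa = 0.5  # parameter of the path aggregation branch (toy scalar)
    for _ in range(steps):
        # "E-step" analogue: message passing branch fixed, update the path branch.
        residual = 1.0 - theta_mp * theta_pa
        theta_pa += lr * 2 * theta_mp * residual  # gradient step on joint_loss
        # "M-step" analogue: path branch fixed, update the message passing branch.
        residual = 1.0 - theta_mp * theta_pa
        theta_mp += lr * 2 * theta_pa * residual
    return theta_mp, theta_pa

theta_mp, theta_pa = train_em_style()
print(joint_loss(theta_mp, theta_pa) < 1e-3)
```

The point of the alternation is that each branch is optimized against a stable snapshot of the other, so the shared objective decreases monotonically per sub-step; the paper's actual E-step additionally constructs a graph-of-graph over source and target domains, which this scalar sketch omits.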


Similar Articles

1. DREAM: A Dual Variational Framework for Unsupervised Graph Domain Adaptation.
   IEEE Trans Pattern Anal Mach Intell. 2025 Nov;47(11):10787-10800. doi: 10.1109/TPAMI.2025.3596054.
2. Structure enhanced prototypical alignment for unsupervised cross-domain node classification.
   Neural Netw. 2024 Sep;177:106396. doi: 10.1016/j.neunet.2024.106396. Epub 2024 May 18.
3. Graph Decoupling Attention Markov Networks for Semisupervised Graph Node Classification.
   IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):9859-9873. doi: 10.1109/TNNLS.2022.3161453. Epub 2023 Nov 30.
4. Graph structure reforming framework enhanced by commute time distance for graph classification.
   Neural Netw. 2023 Nov;168:539-548. doi: 10.1016/j.neunet.2023.09.044. Epub 2023 Sep 26.
5. Unsupervised graph-level representation learning with hierarchical contrasts.
   Neural Netw. 2023 Jan;158:359-368. doi: 10.1016/j.neunet.2022.11.019. Epub 2022 Nov 26.
6. GHNN: Graph Harmonic Neural Networks for semi-supervised graph-level classification.
   Neural Netw. 2022 Jul;151:70-79. doi: 10.1016/j.neunet.2022.03.018. Epub 2022 Mar 24.
7. Learning to Model Graph Structural Information on MLPs via Graph Structure Self-Contrasting.
   IEEE Trans Neural Netw Learn Syst. 2024 Sep 30;PP. doi: 10.1109/TNNLS.2024.3458405.
8. Domain-adaptive message passing graph neural network.
   Neural Netw. 2023 Jul;164:439-454. doi: 10.1016/j.neunet.2023.04.038. Epub 2023 May 3.
9. Deep Neural Message Passing With Hierarchical Layer Aggregation and Neighbor Normalization.
   IEEE Trans Neural Netw Learn Syst. 2022 Dec;33(12):7172-7184. doi: 10.1109/TNNLS.2021.3084319. Epub 2022 Nov 30.
10. Heterogeneous Graph Attention Network for Unsupervised Multiple-Target Domain Adaptation.
    IEEE Trans Pattern Anal Mach Intell. 2022 Apr;44(4):1992-2003. doi: 10.1109/TPAMI.2020.3026079. Epub 2022 Mar 4.