


Learning Domain-Independent Deep Representations by Mutual Information Minimization.

Affiliations

College of Mathematics, Sichuan University, Chengdu 610065, China.

College of Cybersecurity, Sichuan University, Chengdu 610065, China.

Publication Info

Comput Intell Neurosci. 2019 Jun 16;2019:9414539. doi: 10.1155/2019/9414539. eCollection 2019.

DOI: 10.1155/2019/9414539
PMID: 31316558
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6604496/
Abstract

Domain transfer learning aims to learn common data representations from a source domain and a target domain so that the source domain data can help the classification of the target domain. Conventional transfer representation learning constrains the distributions of source and target domain representations to be similar, which relies heavily on how the domain distributions are characterized and on the distribution-matching criteria. In this paper, we propose a novel framework for domain transfer representation learning. Our motivation is to make the learned representations of data points independent of the domains they belong to. In other words, given an optimal cross-domain representation of a data point, it is difficult to tell which domain it came from; in this way, the learned representations can generalize to different domains. To measure the dependency between the representations and the domains the data points belong to, we propose to use the mutual information between the representations and the domain-membership indicators. By minimizing this mutual information, we learn representations that are independent of domains. We build a classwise deep convolutional network model as the representation model and maximize the margin of each data point for its class, where the margin is defined over intraclass and interclass neighborhoods. To learn the model parameters, we construct a unified minimization problem in which the margins are maximized while the representation-domain mutual information is minimized. In this way, we learn representations that are not only discriminative but also independent of domains. An iterative algorithm based on the Adam optimization method is proposed to solve the minimization problem, learning the classwise deep model parameters and the cross-domain representations simultaneously. Extensive experiments on benchmark datasets show the method's effectiveness and its advantage over existing domain transfer learning methods.
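The core idea of the abstract — representations from which the domain cannot be inferred have zero mutual information with the domain indicator — can be illustrated with a minimal empirical estimate of I(C; D) between discretized representation codes and domain-membership indicators. This is a sketch for intuition only, not the paper's estimator or network; the function name and the toy code/domain vectors are illustrative assumptions.

```python
import numpy as np

def mutual_information(codes, domains):
    """Empirical mutual information I(C; D) between discretized
    representation codes and domain-membership indicators."""
    codes, domains = np.asarray(codes), np.asarray(domains)
    mi = 0.0
    for c in np.unique(codes):
        for d in np.unique(domains):
            p_cd = np.mean((codes == c) & (domains == d))  # joint probability
            if p_cd == 0:
                continue
            p_c = np.mean(codes == c)    # marginal over codes
            p_d = np.mean(domains == d)  # marginal over domains
            mi += p_cd * np.log(p_cd / (p_c * p_d))
    return mi

# Codes that reveal the domain exactly: maximal dependence, I(C; D) = log 2.
dependent = mutual_information([0, 0, 1, 1], [0, 0, 1, 1])
# Codes distributed identically across both domains: I(C; D) = 0,
# the domain-independent regime the paper's objective targets.
independent = mutual_information([0, 1, 0, 1], [0, 0, 1, 1])
print(dependent, independent)
```

Minimizing such a dependency term alongside the margin-based classification loss, as the unified objective in the abstract describes, pushes the learned codes toward the second, domain-independent regime.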


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c98/6604496/15732b347d30/CIN2019-9414539.alg.001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c98/6604496/26fb587fef08/CIN2019-9414539.001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c98/6604496/cdc01fde197a/CIN2019-9414539.002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c98/6604496/22b74de79b61/CIN2019-9414539.003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c98/6604496/fe265e0588f5/CIN2019-9414539.004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c98/6604496/3dbc17f4675d/CIN2019-9414539.005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9c98/6604496/02ed45ef3c2b/CIN2019-9414539.006.jpg

Similar Articles

1. Learning Domain-Independent Deep Representations by Mutual Information Minimization.
Comput Intell Neurosci. 2019 Jun 16;2019:9414539. doi: 10.1155/2019/9414539. eCollection 2019.

2. Dual-Representation-Based Autoencoder for Domain Adaptation.
IEEE Trans Cybern. 2022 Aug;52(8):7464-7477. doi: 10.1109/TCYB.2020.3040763. Epub 2022 Jul 19.

3. Joint Learning of Multiple Latent Domains and Deep Representations for Domain Adaptation.
IEEE Trans Cybern. 2021 May;51(5):2676-2687. doi: 10.1109/TCYB.2019.2921559. Epub 2021 Apr 15.

4. Multi-source adaptation joint kernel sparse representation for visual classification.
Neural Netw. 2016 Apr;76:135-151. doi: 10.1016/j.neunet.2016.01.008. Epub 2016 Feb 3.

5. Learning explicitly transferable representations for domain adaptation.
Neural Netw. 2020 Oct;130:39-48. doi: 10.1016/j.neunet.2020.06.016. Epub 2020 Jun 25.

6. Simultaneously learning affinity matrix and data representations for machine fault diagnosis.
Neural Netw. 2020 Feb;122:395-406. doi: 10.1016/j.neunet.2019.11.007. Epub 2019 Nov 22.

7. Modality independent adversarial network for generalized zero shot image classification.
Neural Netw. 2021 Feb;134:11-22. doi: 10.1016/j.neunet.2020.11.007. Epub 2020 Nov 21.

8. Domain Invariant and Class Discriminative Feature Learning for Visual Domain Adaptation.
IEEE Trans Image Process. 2018 Sep;27(9):4260-4273. doi: 10.1109/TIP.2018.2839528.

9. Feature Space Independent Semi-Supervised Domain Adaptation via Kernel Matching.
IEEE Trans Pattern Anal Mach Intell. 2015 Jan;37(1):54-66. doi: 10.1109/TPAMI.2014.2343216.

10. Multi-Source Deep Transfer Neural Network Algorithm.
Sensors (Basel). 2019 Sep 16;19(18):3992. doi: 10.3390/s19183992.

References Cited in This Article

1. Transferable Representation Learning with Deep Adaptation Networks.
IEEE Trans Pattern Anal Mach Intell. 2019 Dec;41(12):3071-3085. doi: 10.1109/TPAMI.2018.2868685. Epub 2018 Sep 5.

2. Multiclass Informative Instance Transfer Learning Framework for Motor Imagery-Based Brain-Computer Interface.
Comput Intell Neurosci. 2018 Feb 22;2018:6323414. doi: 10.1155/2018/6323414. eCollection 2018.

3. Selective Transfer Machine for Personalized Facial Expression Analysis.
IEEE Trans Pattern Anal Mach Intell. 2017 Mar;39(3):529-545. doi: 10.1109/TPAMI.2016.2547397. Epub 2016 Mar 28.

4. Feature Space Independent Semi-Supervised Domain Adaptation via Kernel Matching.
IEEE Trans Pattern Anal Mach Intell. 2015 Jan;37(1):54-66. doi: 10.1109/TPAMI.2014.2343216.

5. Context transfer in reinforcement learning using action-value functions.
Comput Intell Neurosci. 2014;2014:428567. doi: 10.1155/2014/428567. Epub 2014 Dec 31.

6. Domain transfer multiple kernel learning.
IEEE Trans Pattern Anal Mach Intell. 2012 Mar;34(3):465-79. doi: 10.1109/TPAMI.2011.114.