

Incremental Embedding Learning With Disentangled Representation Translation.

Author Information

Wei Kun, Chen Da, Li Yuhong, Yang Xu, Deng Cheng, Tao Dacheng

Publication Information

IEEE Trans Neural Netw Learn Syst. 2024 Mar;35(3):3821-3833. doi: 10.1109/TNNLS.2022.3199816. Epub 2024 Feb 29.

DOI: 10.1109/TNNLS.2022.3199816
PMID: 36063529
Abstract

Humans are capable of accumulating knowledge by sequentially learning different tasks, while neural networks fail to achieve this due to catastrophic forgetting problems. Most current incremental learning methods focus more on tackling catastrophic forgetting for traditional classification networks. Notably, however, embedding networks that are basic architectures for many metric learning applications also suffer from this problem. Moreover, the most significant difficulty for continual embedding networks is that the relationships between the latent features and prototypes of previous tasks will be destroyed once new tasks have been learned. Accordingly, we propose a novel incremental method for embedding networks, called the disentangled representation translation (DRT) method, to obtain the discriminative class-disentangled features without reusing any samples of previous tasks and while avoiding the perturbation of task-related information. Next, a mask-guided module is specifically explored to adaptively change or retain the valuable information of latent features. This module enables us to effectively preserve the discriminative yet representative features in the disentangled translation process. In addition, DRT can easily be equipped with a regularization item of incremental learning to further improve performance. We conduct extensive experiments on four popular datasets; as the experimental results clearly demonstrate, our method can effectively alleviate the catastrophic forgetting problem for embedding networks.
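The two mechanisms the abstract describes — a mask-guided module that adaptively changes or retains parts of a latent feature, and an optional regularization term that discourages drift from previously learned parameters — can be illustrated with a toy sketch. Everything below (the shapes, the sigmoid gate, the linear translation map, and the simple L2 penalty toward frozen old-task weights) is an assumption made for illustration; it is not the paper's actual DRT architecture.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MaskGuidedTranslator:
    """Toy sketch: a learned mask decides, per feature dimension, how much
    of the old latent feature to retain versus replace with a translated
    (task-adapted) version. Hypothetical stand-in, not the paper's model."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.W_translate = rng.normal(0.0, 0.1, (dim, dim))  # translation map
        self.W_mask = rng.normal(0.0, 0.1, (dim, dim))       # mask generator

    def forward(self, z):
        mask = sigmoid(z @ self.W_mask)       # in (0, 1): how much to change
        z_translated = z @ self.W_translate   # candidate adapted representation
        # Mask-guided mixing: keep (1 - mask) of the old feature, adapt the rest.
        return (1.0 - mask) * z + mask * z_translated

def incremental_penalty(params_new, params_old, lam=1.0):
    """Simple L2 regularizer pulling new-task weights toward the frozen
    old-task weights -- one common way to add an incremental-learning
    penalty of the kind the abstract mentions."""
    return lam * float(np.sum((params_new - params_old) ** 2))

# Usage: translate a batch of latent features, then score parameter drift.
dim = 8
model = MaskGuidedTranslator(dim)
z = np.ones((2, dim))
out = model.forward(z)            # same shape as the input features

old_w = model.W_translate.copy()
new_w = old_w + 0.01              # pretend training nudged every weight
print(out.shape)                                  # → (2, 8)
print(round(incremental_penalty(new_w, old_w), 6))  # → 0.0064
```

The sigmoid keeps the mask strictly between 0 and 1, so each dimension is always a convex blend of the old and translated features rather than a hard overwrite — one plausible reading of "adaptively change or retain" valuable information.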


Similar Articles

1. Incremental Embedding Learning With Disentangled Representation Translation.
   IEEE Trans Neural Netw Learn Syst. 2024 Mar;35(3):3821-3833. doi: 10.1109/TNNLS.2022.3199816. Epub 2024 Feb 29.
2. Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning.
   Neural Comput. 2019 Nov;31(11):2266-2291. doi: 10.1162/neco_a_01232. Epub 2019 Sep 16.
3. Multi-granularity knowledge distillation and prototype consistency regularization for class-incremental learning.
   Neural Netw. 2023 Jul;164:617-630. doi: 10.1016/j.neunet.2023.05.006. Epub 2023 May 11.
4. Encoding primitives generation policy learning for robotic arm to overcome catastrophic forgetting in sequential multi-tasks learning.
   Neural Netw. 2020 Sep;129:163-173. doi: 10.1016/j.neunet.2020.06.003. Epub 2020 Jun 5.
5. A class-incremental learning approach for learning feature-compatible embeddings.
   Neural Netw. 2024 Dec;180:106685. doi: 10.1016/j.neunet.2024.106685. Epub 2024 Aug 31.
6. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning.
   IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):1925-1934. doi: 10.1109/TNNLS.2021.3111019. Epub 2022 May 2.
7. Overcoming Catastrophic Forgetting in Continual Learning by Exploring Eigenvalues of Hessian Matrix.
   IEEE Trans Neural Netw Learn Syst. 2024 Nov;35(11):16196-16210. doi: 10.1109/TNNLS.2023.3292359. Epub 2024 Oct 29.
8. Incremental Learning for Online Data Using QR Factorization on Convolutional Neural Networks.
   Sensors (Basel). 2023 Sep 27;23(19):8117. doi: 10.3390/s23198117.
9. Continual learning with attentive recurrent neural networks for temporal data classification.
   Neural Netw. 2023 Jan;158:171-187. doi: 10.1016/j.neunet.2022.10.031. Epub 2022 Nov 11.
10. Class-incremental learning with Balanced Embedding Discrimination Maximization.
    Neural Netw. 2024 Nov;179:106487. doi: 10.1016/j.neunet.2024.106487. Epub 2024 Jun 22.