

Memory-Efficient Class-Incremental Learning for Image Classification.

Authors

Zhao Hanbin, Wang Hui, Fu Yongjian, Wu Fei, Li Xi

Publication

IEEE Trans Neural Netw Learn Syst. 2022 Oct;33(10):5966-5977. doi: 10.1109/TNNLS.2021.3072041. Epub 2022 Oct 5.

DOI: 10.1109/TNNLS.2021.3072041
PMID: 33939615
Abstract

With the memory-resource-limited constraints, class-incremental learning (CIL) usually suffers from the "catastrophic forgetting" problem when updating the joint classification model on the arrival of newly added classes. To cope with the forgetting problem, many CIL methods transfer the knowledge of old classes by preserving some exemplar samples into the size-constrained memory buffer. To utilize the memory buffer more efficiently, we propose to keep more auxiliary low-fidelity exemplar samples, rather than the original real-high-fidelity exemplar samples. Such a memory-efficient exemplar preserving scheme makes the old-class knowledge transfer more effective. However, the low-fidelity exemplar samples are often distributed in a different domain away from that of the original exemplar samples, that is, a domain shift. To alleviate this problem, we propose a duplet learning scheme that seeks to construct domain-compatible feature extractors and classifiers, which greatly narrows down the above domain gap. As a result, these low-fidelity auxiliary exemplar samples have the ability to moderately replace the original exemplar samples with a lower memory cost. In addition, we present a robust classifier adaptation scheme, which further refines the biased classifier (learned with the samples containing distillation label knowledge about old classes) with the help of the samples of pure true class labels. Experimental results demonstrate the effectiveness of this work against the state-of-the-art approaches. We will release the code, baselines, and training statistics for all models to facilitate future research.
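The core memory trade-off the abstract describes — keeping more low-fidelity exemplars instead of fewer full-fidelity ones in a fixed-size buffer — can be illustrated with a small sketch. This is not the authors' implementation; `downsample` and `exemplars_per_budget` are hypothetical helpers assuming downsampling as the fidelity-reduction scheme, shown only to make the budget arithmetic concrete.

```python
import numpy as np


def downsample(img: np.ndarray, factor: int) -> np.ndarray:
    """Average-pool an HxWxC image by `factor` to create a low-fidelity copy."""
    h, w, c = img.shape
    h2, w2 = h // factor, w // factor
    cropped = img[:h2 * factor, :w2 * factor]
    return cropped.reshape(h2, factor, w2, factor, c).mean(axis=(1, 3))


def exemplars_per_budget(budget_bytes: int, img_shape, factor: int = 1,
                         dtype=np.float32) -> int:
    """Number of exemplars of a given (possibly downsampled) size that fit."""
    h, w, c = img_shape
    bytes_per = (h // factor) * (w // factor) * c * np.dtype(dtype).itemsize
    return budget_bytes // bytes_per


# A buffer sized for 2000 full-resolution 32x32x3 float32 exemplars
budget = 32 * 32 * 3 * 4 * 2000
full_fidelity = exemplars_per_budget(budget, (32, 32, 3), factor=1)  # 2000
low_fidelity = exemplars_per_budget(budget, (32, 32, 3), factor=2)   # 8000
```

Halving the spatial resolution quadruples the number of exemplars the same buffer can hold, which is what motivates the paper's duplet learning scheme: the feature extractor must then be made compatible with the domain shift these low-fidelity samples introduce.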


Similar Articles

1
Memory-Efficient Class-Incremental Learning for Image Classification.
IEEE Trans Neural Netw Learn Syst. 2022 Oct;33(10):5966-5977. doi: 10.1109/TNNLS.2021.3072041. Epub 2022 Oct 5.
2
CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning.
IEEE Trans Image Process. 2022;31:3825-3837. doi: 10.1109/TIP.2022.3176130. Epub 2022 Jun 2.
3
Multi-granularity knowledge distillation and prototype consistency regularization for class-incremental learning.
Neural Netw. 2023 Jul;164:617-630. doi: 10.1016/j.neunet.2023.05.006. Epub 2023 May 11.
4
Less confidence, less forgetting: Learning with a humbler teacher in exemplar-free Class-Incremental learning.
Neural Netw. 2024 Nov;179:106513. doi: 10.1016/j.neunet.2024.106513. Epub 2024 Jul 6.
5
Inherit With Distillation and Evolve With Contrast: Exploring Class Incremental Semantic Segmentation Without Exemplar Memory.
IEEE Trans Pattern Anal Mach Intell. 2023 Oct;45(10):11932-11947. doi: 10.1109/TPAMI.2023.3273574. Epub 2023 Sep 5.
6
Balanced Destruction-Reconstruction Dynamics for Memory-Replay Class Incremental Learning.
IEEE Trans Image Process. 2024;33:4966-4981. doi: 10.1109/TIP.2024.3451932. Epub 2024 Sep 11.
7
Dual Balanced Class-Incremental Learning With im-Softmax and Angular Rectification.
IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):4437-4447. doi: 10.1109/TNNLS.2024.3368341. Epub 2025 Feb 28.
8
Class-Incremental Learning: A Survey.
IEEE Trans Pattern Anal Mach Intell. 2024 Dec;46(12):9851-9873. doi: 10.1109/TPAMI.2024.3429383. Epub 2024 Nov 6.
9
Class-Incremental Learning on Video-Based Action Recognition by Distillation of Various Knowledge.
Comput Intell Neurosci. 2022 Mar 24;2022:4879942. doi: 10.1155/2022/4879942. eCollection 2022.
10
Class incremental learning of remote sensing images based on class similarity distillation.
PeerJ Comput Sci. 2023 Sep 27;9:e1583. doi: 10.7717/peerj-cs.1583. eCollection 2023.

Cited By

1
Advancing autonomy through lifelong learning: a survey of autonomous intelligent systems.
Front Neurorobot. 2024 Apr 5;18:1385778. doi: 10.3389/fnbot.2024.1385778. eCollection 2024.
2
Novel Meta-Learning Techniques for the Multiclass Image Classification Problem.
Sensors (Basel). 2022 Dec 20;23(1):9. doi: 10.3390/s23010009.
3
An Incremental Class-Learning Approach with Acoustic Novelty Detection for Acoustic Event Recognition.
Sensors (Basel). 2021 Oct 5;21(19):6622. doi: 10.3390/s21196622.