

Class-Incremental Learning: A Survey.

Authors

Zhou Da-Wei, Wang Qi-Wei, Qi Zhi-Hong, Ye Han-Jia, Zhan De-Chuan, Liu Ziwei

Published in

IEEE Trans Pattern Anal Mach Intell. 2024 Dec;46(12):9851-9873. doi: 10.1109/TPAMI.2024.3429383. Epub 2024 Nov 6.

DOI: 10.1109/TPAMI.2024.3429383
PMID: 39012754
Abstract

Deep models, e.g., CNNs and Vision Transformers, have achieved impressive results on many vision tasks in the closed world. However, novel classes emerge from time to time in our ever-changing world, requiring a learning system to acquire new knowledge continually. Class-Incremental Learning (CIL) enables the learner to incorporate knowledge of new classes incrementally and build a universal classifier over all seen classes. Correspondingly, when the model is trained directly on new class instances, a fatal problem occurs: the model tends to catastrophically forget the characteristics of former classes, and its performance degrades drastically. The machine learning community has made numerous efforts to tackle catastrophic forgetting. In this paper, we comprehensively survey recent advances in class-incremental learning and summarize these methods from several aspects. We also provide a rigorous and unified evaluation of 17 methods on benchmark image classification tasks to empirically characterize the different algorithms. Furthermore, we observe that the current comparison protocol ignores the influence of the memory budget for model storage, which may result in unfair comparisons and biased results. Hence, we advocate fair comparison by aligning the memory budget in evaluation, together with several memory-agnostic performance measures.
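As a toy illustration of the setting the abstract describes, the sketch below implements a nearest-class-mean learner with a fixed exemplar budget: each task introduces new classes, old classes are retained only through stored exemplars, and the budget is split across all seen classes. The class layout, budget value, and the `NCMIncrementalLearner` name are invented for this sketch and do not come from the survey; the rehearsal-based methods the paper evaluates are far more elaborate.

```python
def class_mean(samples):
    # mean of a list of 2-D feature vectors
    n = len(samples)
    return (sum(x for x, _ in samples) / n, sum(y for _, y in samples) / n)

class NCMIncrementalLearner:
    """Toy nearest-class-mean classifier with a fixed exemplar budget,
    a stand-in for rehearsal-based class-incremental learning."""
    def __init__(self, memory_budget):
        self.memory_budget = memory_budget
        self.exemplars = {}  # class label -> stored samples
        self.means = {}      # class label -> class mean

    def learn_task(self, task_data):
        # task_data: dict mapping NEW class labels to lists of 2-D points
        for label, samples in task_data.items():
            self.exemplars[label] = list(samples)
        # shrink each class's exemplar set so total storage fits the budget
        per_class = max(1, self.memory_budget // len(self.exemplars))
        for label in self.exemplars:
            self.exemplars[label] = self.exemplars[label][:per_class]
        # rebuild a universal classifier over ALL seen classes (old + new)
        self.means = {lab: class_mean(s) for lab, s in self.exemplars.items()}

    def predict(self, point):
        px, py = point
        return min(self.means, key=lambda lab:
                   (self.means[lab][0] - px) ** 2 + (self.means[lab][1] - py) ** 2)

# two tasks, each introducing new classes (toy 2-D "features")
learner = NCMIncrementalLearner(memory_budget=6)
learner.learn_task({0: [(0, 0), (1, 0), (0, 1)],        # task 1: classes 0 and 1
                    1: [(10, 0), (11, 0), (10, 1)]})
learner.learn_task({2: [(0, 10), (1, 10), (0, 11)]})    # task 2: class 2 only
# old classes are still recognized after learning the new task
assert learner.predict((0, 0)) == 0 and learner.predict((0, 9)) == 2
```

Without the stored exemplars, retraining on task 2 alone would leave the model with no representation of classes 0 and 1 at all, which is the catastrophic-forgetting failure mode; aligning `memory_budget` across methods is the fair-comparison point the abstract raises.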


Similar Articles

1. Class-Incremental Learning: A Survey.
IEEE Trans Pattern Anal Mach Intell. 2024 Dec;46(12):9851-9873. doi: 10.1109/TPAMI.2024.3429383. Epub 2024 Nov 6.
2. Class-Incremental Learning Method With Fast Update and High Retainability Based on Broad Learning System.
IEEE Trans Neural Netw Learn Syst. 2024 Aug;35(8):11332-11345. doi: 10.1109/TNNLS.2023.3259016. Epub 2024 Aug 5.
3. A survey on few-shot class-incremental learning.
Neural Netw. 2024 Jan;169:307-324. doi: 10.1016/j.neunet.2023.10.039. Epub 2023 Oct 31.
4. Multi-granularity knowledge distillation and prototype consistency regularization for class-incremental learning.
Neural Netw. 2023 Jul;164:617-630. doi: 10.1016/j.neunet.2023.05.006. Epub 2023 May 11.
5. Class-Incremental Learning: Survey and Performance Evaluation on Image Classification.
IEEE Trans Pattern Anal Mach Intell. 2023 May;45(5):5513-5533. doi: 10.1109/TPAMI.2022.3213473. Epub 2023 Apr 3.
6. A comprehensive study of class incremental learning algorithms for visual tasks.
Neural Netw. 2021 Mar;135:38-54. doi: 10.1016/j.neunet.2020.12.003. Epub 2020 Dec 8.
7. Mitigate forgetting in few-shot class-incremental learning using different image views.
Neural Netw. 2023 Aug;165:999-1009. doi: 10.1016/j.neunet.2023.06.043. Epub 2023 Jul 5.
8. Few-Shot Class-Incremental Learning by Sampling Multi-Phase Tasks.
IEEE Trans Pattern Anal Mach Intell. 2023 Nov;45(11):12816-12831. doi: 10.1109/TPAMI.2022.3200865.
9. A Continual Learning Survey: Defying Forgetting in Classification Tasks.
IEEE Trans Pattern Anal Mach Intell. 2022 Jul;44(7):3366-3385. doi: 10.1109/TPAMI.2021.3057446. Epub 2022 Jun 3.
10. CCSI: Continual Class-Specific Impression for data-free class incremental learning.
Med Image Anal. 2024 Oct;97:103239. doi: 10.1016/j.media.2024.103239. Epub 2024 Jun 15.

Cited By

1. Dual-Stage Clean-Sample Selection for Incremental Noisy Label Learning.
Bioengineering (Basel). 2025 Jul 8;12(7):743. doi: 10.3390/bioengineering12070743.
2. Uncertainty aware domain incremental learning for cross domain depression detection.
Sci Rep. 2025 Jul 14;15(1):25344. doi: 10.1038/s41598-025-10917-y.
3. Exploring multi-granularity balance strategy for class incremental learning via three-way granular computing.
Brain Inform. 2025 Mar 17;12(1):7. doi: 10.1186/s40708-025-00255-0.
4. ELM-KL-LSTM: a robust and general incremental learning method for efficient classification of time series data.
PeerJ Comput Sci. 2023 Dec 21;9:e1732. doi: 10.7717/peerj-cs.1732. eCollection 2023.