
Few Shot Class Incremental Learning via Efficient Prototype Replay and Calibration

Authors

Zhang Wei, Gu Xiaodong

Affiliation

Department of Electronic Engineering, Fudan University, Shanghai 200438, China.

Publication

Entropy (Basel). 2023 May 10;25(5):776. doi: 10.3390/e25050776.

DOI: 10.3390/e25050776
PMID: 37238532
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10217101/
Abstract

Few-shot class-incremental learning (FSCIL) is an extremely challenging but valuable problem in real-world applications. When faced with novel few-shot tasks at each incremental stage, a model must contend with both catastrophic forgetting of old knowledge and overfitting to new categories under limited training data. In this paper, we propose an efficient prototype replay and calibration (EPRC) method with three stages to improve classification performance. We first perform effective pre-training with rotation and mix-up augmentations in order to obtain a strong backbone. A series of pseudo few-shot tasks is then sampled for meta-training, which enhances the generalization ability of both the feature extractor and the projection layer and thereby helps mitigate the overfitting problem of few-shot learning. Furthermore, an even nonlinear transformation function is incorporated into the similarity computation to implicitly calibrate the generated prototypes of different categories and alleviate the correlations among them. Finally, in the incremental-training stage we replay the stored prototypes to relieve catastrophic forgetting and rectify the prototypes to be more discriminative via an explicit regularization within the loss function. Experimental results on CIFAR-100 and ImageNet demonstrate that EPRC significantly boosts classification performance compared with existing mainstream FSCIL methods.
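As a reading aid for the pre-training stage, here is a minimal PyTorch-style sketch of the two augmentations the abstract names, rotation and mix-up. The rotation set {0°, 90°, 180°, 270°}, the Beta(alpha, alpha) mixing coefficient, and every hyperparameter value below are assumptions; the abstract does not specify EPRC's exact settings or how the two losses are combined.

```python
import torch
import torch.nn.functional as F

def rotation_batch(x, y):
    # Self-supervised rotation augmentation: replicate the batch at
    # 0/90/180/270 degrees; the rotation index serves as an auxiliary label
    # for an extra prediction head (assumed design, common in pre-training).
    # Assumes square images, e.g. 32x32 CIFAR-100 inputs of shape (B, C, H, W).
    xs = torch.cat([torch.rot90(x, k, dims=(2, 3)) for k in range(4)], dim=0)
    rot_labels = torch.arange(4).repeat_interleave(x.size(0))
    return xs, y.repeat(4), rot_labels

def mixup_batch(x, y, num_classes, alpha=0.2):
    # Standard mix-up: convex combinations of image pairs and of their
    # one-hot labels. alpha=0.2 is an assumed, commonly used value.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))
    x_mix = lam * x + (1.0 - lam) * x[perm]
    y_onehot = F.one_hot(y, num_classes).float()
    y_mix = lam * y_onehot + (1.0 - lam) * y_onehot[perm]
    return x_mix, y_mix
```

Rotation prediction is typically attached as an auxiliary head on the backbone, while mixed inputs are trained with a soft-label cross-entropy; the abstract leaves the exact combination open.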

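The remaining stages all operate on class prototypes: similarities between queries and prototypes pass through an even nonlinear transform, stored prototypes are replayed in each incremental session, and an explicit regularizer keeps prototypes discriminative. The sketch below is illustrative only: the abstract does not give the even function or the loss, so g(s) = s² added to the raw cosine similarity, the weights lam and tau, and the pairwise-similarity penalty are placeholder assumptions, not the paper's actual formulation.

```python
import torch
import torch.nn.functional as F

def class_prototypes(feats, labels):
    # Prototype = L2-normalized mean embedding of each class's support samples.
    classes = labels.unique(sorted=True)
    protos = torch.stack([feats[labels == c].mean(dim=0) for c in classes])
    return F.normalize(protos, dim=-1)

def calibrated_logits(query_feats, protos, lam=0.5, tau=16.0):
    # Cosine similarity reshaped by an even term g(s) = s**2. The paper only
    # states that *some* even nonlinearity calibrates the similarities; this
    # particular g and the weights lam/tau are assumptions.
    s = F.normalize(query_feats, dim=-1) @ protos.t()  # raw cosine similarity
    return tau * (s + lam * s.pow(2))                  # s**2 is even: g(-s) = g(s)

def separation_penalty(protos):
    # Illustrative "explicit regularization": discourage high pairwise
    # similarity between prototypes so classes stay discriminative.
    sim = protos @ protos.t()
    off_diag = sim - torch.diag_embed(sim.diagonal())
    return off_diag.clamp(min=0.0).mean()

def incremental_loss(query_feats, query_labels, old_protos, new_protos,
                     reg_weight=0.1):
    # Replay: classify queries against old + new prototypes jointly, so old
    # classes keep shaping the decision boundaries. query_labels must index
    # into the concatenated prototype list (old classes first).
    all_protos = torch.cat([old_protos, new_protos], dim=0)
    logits = calibrated_logits(query_feats, all_protos)
    ce = F.cross_entropy(logits, query_labels)
    return ce + reg_weight * separation_penalty(all_protos)
```

One property worth noting: for lam ≤ 0.5 the map s + lam·s² is non-decreasing on [-1, 1], so this particular calibration rescales similarity gaps without changing which prototype is nearest.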

Figures (entropy-25-00776, g001 through g008):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ccb5/10217101/8c7ebb4d49df/entropy-25-00776-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ccb5/10217101/512ad8979e16/entropy-25-00776-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ccb5/10217101/84064b0dae04/entropy-25-00776-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ccb5/10217101/65488c01682f/entropy-25-00776-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ccb5/10217101/3070e39ef621/entropy-25-00776-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ccb5/10217101/25d06b5cb5f3/entropy-25-00776-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ccb5/10217101/81c8527f681d/entropy-25-00776-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ccb5/10217101/a461d7484976/entropy-25-00776-g008.jpg

Similar Articles

1. Few Shot Class Incremental Learning via Efficient Prototype Replay and Calibration. Entropy (Basel). 2023 May 10;25(5):776. doi: 10.3390/e25050776.
2. A survey on few-shot class-incremental learning. Neural Netw. 2024 Jan;169:307-324. doi: 10.1016/j.neunet.2023.10.039. Epub 2023 Oct 31.
3. Few-Shot Class-Incremental Learning for Medical Time Series Classification. IEEE J Biomed Health Inform. 2023 Feb 22;PP. doi: 10.1109/JBHI.2023.3247861.
4. Few-Shot Class-Incremental Learning by Sampling Multi-Phase Tasks. IEEE Trans Pattern Anal Mach Intell. 2023 Nov;45(11):12816-12831. doi: 10.1109/TPAMI.2022.3200865.
5. Mitigate forgetting in few-shot class-incremental learning using different image views. Neural Netw. 2023 Aug;165:999-1009. doi: 10.1016/j.neunet.2023.06.043. Epub 2023 Jul 5.
6. DyCR: A Dynamic Clustering and Recovering Network for Few-Shot Class-Incremental Learning. IEEE Trans Neural Netw Learn Syst. 2025 Apr;36(4):7116-7129. doi: 10.1109/TNNLS.2024.3394844. Epub 2025 Apr 4.
7. Model Attention Expansion for Few-Shot Class-Incremental Learning. IEEE Trans Image Process. 2024;33:4419-4431. doi: 10.1109/TIP.2024.3434475. Epub 2024 Aug 6.
8. Memorizing Complementation Network for Few-Shot Class-Incremental Learning. IEEE Trans Image Process. 2023;32:937-948. doi: 10.1109/TIP.2023.3236160. Epub 2023 Jan 23.
9. Learnable Distribution Calibration for Few-Shot Class-Incremental Learning. IEEE Trans Pattern Anal Mach Intell. 2023 Oct;45(10):12699-12706. doi: 10.1109/TPAMI.2023.3273291. Epub 2023 Sep 5.
10. Dynamic Support Network for Few-Shot Class Incremental Learning. IEEE Trans Pattern Anal Mach Intell. 2023 Mar;45(3):2945-2951. doi: 10.1109/TPAMI.2022.3175849. Epub 2023 Feb 3.
