Brain-Inspired Fast- and Slow-Update Prompt Tuning for Few-Shot Class-Incremental Learning

Authors

Ran Hang, Gao Xingyu, Li Lusi, Li Weijun, Tian Songsong, Wang Gang, Shi Hailong, Ning Xin

Publication

IEEE Trans Neural Netw Learn Syst. 2024 Sep 18;PP. doi: 10.1109/TNNLS.2024.3454237.

Abstract

Few-shot class-incremental learning (FSCIL) aims to learn new classes incrementally with only a limited number of samples per class. Foundation models combined with prompt tuning exhibit robust generalization and zero-shot learning (ZSL) capabilities, giving them a potential transfer advantage for FSCIL. However, existing prompt tuning methods excel at optimizing for stationary datasets, which diverges from the inherently sequential nature of the FSCIL paradigm. To address this issue, taking inspiration from the "fast and slow mechanism" of the complementary learning systems (CLSs) in the brain, we present fast- and slow-update prompt tuning for FSCIL (FSPT-FSCIL), a brain-inspired prompt tuning method for transferring foundation models to the FSCIL task. We categorize the prompts into two groups: fast-update prompts and slow-update prompts, which are interactively trained through meta-learning. Fast-update prompts aim to learn new knowledge within a limited number of iterations, while slow-update prompts serve as meta-knowledge and aim to strike a balance between rapid learning and avoiding catastrophic forgetting. Through experiments on multiple benchmarks, we demonstrate the effectiveness and superiority of FSPT-FSCIL. The code is available at https://github.com/qihangran/FSPT-FSCIL.
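The fast/slow split described above can be illustrated with a toy sketch. This is not the authors' implementation: the loss function, dimensions, learning rates, and variable names below are all illustrative assumptions. The point is the structure — fast-update prompts adapt to each new session in a short inner loop, while slow-update prompts move only slightly in an outer meta-update, which is what protects older knowledge from catastrophic forgetting.

```python
import numpy as np

# Toy sketch (illustrative assumptions, not the paper's code): two prompt
# groups updated at different speeds across incremental sessions.
rng = np.random.default_rng(0)
dim = 4
fast_prompt = rng.normal(size=dim)   # re-adapted rapidly per session
slow_prompt = rng.normal(size=dim)   # shared meta-knowledge, changes slowly

def session_loss(fast, slow, target):
    # Stand-in loss: distance of the combined prompt from a session target.
    return 0.5 * np.sum((fast + slow - target) ** 2)

def grad(fast, slow, target):
    # Analytic gradient of session_loss w.r.t. either prompt group.
    return fast + slow - target

fast_lr, slow_lr, inner_steps = 0.5, 0.05, 5

for session in range(3):                  # incremental sessions
    target = rng.normal(size=dim)         # few-shot "new class" signal
    fast = fast_prompt.copy()
    for _ in range(inner_steps):          # fast loop: rapid adaptation
        fast -= fast_lr * grad(fast, slow_prompt, target)
    # Slow (outer) loop: a small meta-update using the adapted fast
    # prompts, so knowledge stored in slow_prompt is mostly preserved.
    slow_prompt -= slow_lr * grad(fast, slow_prompt, target)
    fast_prompt = fast

print(session_loss(fast_prompt, slow_prompt, target))
```

The asymmetric learning rates (`fast_lr` ≫ `slow_lr`) are the CLS-inspired ingredient: the inner loop drives the session loss down quickly, while the outer update nudges the shared prompts just enough to accumulate meta-knowledge across sessions.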

