
PKI: Prior knowledge-infused neural network for few-shot class-incremental learning.

Author Information

Bao Kexin, Lin Fanzhao, Wang Zichen, Li Yong, Zeng Dan, Ge Shiming

Affiliations

Institute of Information Engineering, Chinese Academy of Sciences, Beijing, 100092, China; School of Cyber Security, University of Chinese Academy of Sciences, Beijing, 100049, China.

NARI Technology Company Limited, Nanjing, 210000, Jiangsu, China.

Publication Information

Neural Netw. 2025 Jul 10;192:107724. doi: 10.1016/j.neunet.2025.107724.

DOI: 10.1016/j.neunet.2025.107724
PMID: 40674900
Abstract

Few-shot class-incremental learning (FSCIL) aims to continually adapt a model on a limited number of new-class examples, facing two well-known challenges: catastrophic forgetting and overfitting to new classes. Existing methods tend to freeze more parts of network components and finetune others with an extra memory during incremental sessions. These methods emphasize preserving prior knowledge to ensure proficiency in recognizing old classes, thereby mitigating catastrophic forgetting. Meanwhile, constraining fewer parameters can help in overcoming overfitting with the assistance of prior knowledge. Following previous methods, we retain more prior knowledge and propose a prior knowledge-infused neural network (PKI) to facilitate FSCIL. PKI consists of a backbone, an ensemble of projectors, a classifier, and an extra memory. In each incremental session, we build a new projector and add it to the ensemble. Subsequently, we finetune the new projector and the classifier jointly with other frozen network components, ensuring the rich prior knowledge is utilized effectively. By cascading projectors, PKI integrates prior knowledge accumulated from previous sessions and learns new knowledge flexibly, which helps to recognize old classes and efficiently learn new classes. Further, to reduce the resource consumption associated with keeping many projectors, we design two variants of the prior knowledge-infused neural network (PKIV-1 and PKIV-2) that trade off resource consumption against performance by reducing the number of projectors. Extensive experiments on three popular benchmarks demonstrate that our approach outperforms state-of-the-art methods.

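The incremental-session mechanism the abstract describes (freeze the backbone and all earlier projectors, append a new projector to the cascade, and finetune only that projector jointly with the classifier) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name, dimensions, tanh nonlinearity, and random linear maps are all assumptions made for demonstration.

```python
import numpy as np

class PKISketch:
    """Minimal sketch of the PKI idea: a frozen backbone, a growing
    cascade of projectors (only the newest is trainable in each
    session), and a classifier widened as new classes arrive."""

    def __init__(self, feat_dim=8, n_classes=5, seed=0):
        self.rng = np.random.default_rng(seed)
        self.feat_dim = feat_dim
        # Stand-in for the frozen backbone: a fixed random linear map.
        self.backbone = self.rng.standard_normal((feat_dim, feat_dim))
        self.projectors = []   # cascade, one projector per session
        self.frozen = []       # freeze flags, parallel to projectors
        self.classifier = self.rng.standard_normal((feat_dim, n_classes))

    def new_session(self, n_new_classes):
        # Freeze every existing projector, append a fresh trainable one,
        # and widen the classifier to cover the new classes.
        self.frozen = [True] * len(self.projectors)
        self.projectors.append(
            self.rng.standard_normal((self.feat_dim, self.feat_dim)))
        self.frozen.append(False)
        extra = self.rng.standard_normal((self.feat_dim, n_new_classes))
        self.classifier = np.concatenate([self.classifier, extra], axis=1)

    def forward(self, x):
        h = x @ self.backbone
        for p in self.projectors:   # cascade accumulated prior knowledge
            h = np.tanh(h @ p)
        return h @ self.classifier  # logits over all classes seen so far

    def trainable(self):
        # Only the newest projector and the classifier are finetuned.
        parts = ["classifier"]
        if self.projectors and not self.frozen[-1]:
            parts.append(f"projector_{len(self.projectors) - 1}")
        return parts
```

After two incremental sessions of 3 and 2 classes on a 5-class base, the classifier covers 10 classes while only the latest projector and the classifier remain trainable, mirroring the freeze-and-append scheme the abstract describes.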

Similar Articles

1. PKI: Prior knowledge-infused neural network for few-shot class-incremental learning.
   Neural Netw. 2025 Jul 10;192:107724. doi: 10.1016/j.neunet.2025.107724.
2. Short-Term Memory Impairment
3. Brain-Inspired Fast- and Slow-Update Prompt Tuning for Few-Shot Class-Incremental Learning.
   IEEE Trans Neural Netw Learn Syst. 2024 Sep 18;PP. doi: 10.1109/TNNLS.2024.3454237.
4. Systemic pharmacological treatments for chronic plaque psoriasis: a network meta-analysis.
   Cochrane Database Syst Rev. 2021 Apr 19;4(4):CD011535. doi: 10.1002/14651858.CD011535.pub4.
5. Point-cloud segmentation with in-silico data augmentation for prostate cancer treatment.
   Med Phys. 2025 Apr 3. doi: 10.1002/mp.17815.
6. Systemic pharmacological treatments for chronic plaque psoriasis: a network meta-analysis.
   Cochrane Database Syst Rev. 2020 Jan 9;1(1):CD011535. doi: 10.1002/14651858.CD011535.pub3.
7. PASS++: A Dual Bias Reduction Framework for Non-Exemplar Class-Incremental Learning.
   IEEE Trans Pattern Anal Mach Intell. 2025 Aug;47(8):7123-7139. doi: 10.1109/TPAMI.2025.3568886.
8. The Black Book of Psychotropic Dosing and Monitoring.
   Psychopharmacol Bull. 2024 Jul 8;54(3):8-59.
9. The quantity, quality and findings of network meta-analyses evaluating the effectiveness of GLP-1 RAs for weight loss: a scoping review.
   Health Technol Assess. 2025 Jun 25:1-73. doi: 10.3310/SKHT8119.
10. Long-acting inhaled therapy (beta-agonists, anticholinergics and steroids) for COPD: a network meta-analysis.
    Cochrane Database Syst Rev. 2014 Mar 26;2014(3):CD010844. doi: 10.1002/14651858.CD010844.pub2.