Tu Dunwei, Yi Huiyu, Zhang Tieyi, Li Ruotong, Shen Furao, Zhao Jian
National Key Laboratory for Novel Software Technology, Nanjing University, China; School of Artificial Intelligence, Nanjing University, Nanjing, 210023, China.
Neural Netw. 2025 Oct;190:107608. doi: 10.1016/j.neunet.2025.107608. Epub 2025 May 27.
Few-shot class-incremental learning (FSCIL) aims to continually learn new classes from only a few samples without forgetting previous ones, requiring intelligent agents to adapt to dynamic environments. FSCIL combines the characteristics and challenges of class-incremental learning and few-shot learning: (i) the current classes occupy the entire feature space, which is detrimental to learning new classes; (ii) the small number of samples in incremental rounds is insufficient for full training. To address challenge (i), existing mainstream virtual-class methods use virtual classes as placeholders; however, new classes may not necessarily align with the virtual classes. For challenge (ii), they replace trainable fully connected layers with Nearest Class Mean (NCM) classifiers based on cosine similarity, but NCM classifiers do not account for sample-imbalance issues. To address these issues, we propose the class-center guided embedding Space Allocation with Angle-Norm joint classifiers (SAAN) learning framework, which provides balanced space for all classes and leverages the norm differences caused by sample imbalance to enhance the classification criterion. Specifically, for challenge (i), SAAN divides the feature space into multiple subspaces and allocates a dedicated subspace to each session by guiding samples toward pre-set category centers. For challenge (ii), SAAN establishes a norm distribution for each class and generates angle-norm joint logits. Experiments demonstrate that SAAN achieves state-of-the-art performance and can be directly embedded into other SOTA methods as a plug-in, further enhancing their performance.
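The angle-norm joint classifier described above can be illustrated with a minimal sketch. The paper's exact formulation is not given in the abstract, so the combination rule below is an assumption: an NCM-style cosine-similarity term is augmented with the Gaussian log-likelihood of each sample's feature norm under a per-class norm distribution, weighted by a hypothetical mixing coefficient `alpha`.

```python
import numpy as np

def fit_class_stats(features, labels, num_classes):
    """Per-class mean vector (for the angle/NCM term) and
    norm mean/std (for the norm term). Sketch, not the paper's code."""
    means, norm_mu, norm_sigma = [], [], []
    for c in range(num_classes):
        f = features[labels == c]
        means.append(f.mean(axis=0))
        norms = np.linalg.norm(f, axis=1)
        norm_mu.append(norms.mean())
        norm_sigma.append(norms.std() + 1e-8)  # avoid divide-by-zero
    return np.stack(means), np.array(norm_mu), np.array(norm_sigma)

def angle_norm_logits(x, means, norm_mu, norm_sigma, alpha=1.0):
    """Angle-norm joint logits (assumed combination rule)."""
    # Angle term: cosine similarity to each class mean, as in NCM.
    x_dir = x / np.linalg.norm(x, axis=1, keepdims=True)
    m_dir = means / np.linalg.norm(means, axis=1, keepdims=True)
    cos = x_dir @ m_dir.T                          # shape (N, C)
    # Norm term: Gaussian log-likelihood of the sample's feature norm
    # under each class's norm distribution.
    n = np.linalg.norm(x, axis=1, keepdims=True)   # shape (N, 1)
    log_p = -0.5 * ((n - norm_mu) / norm_sigma) ** 2 - np.log(norm_sigma)
    return cos + alpha * log_p
```

With two synthetic classes whose features differ both in direction and in norm, the norm term breaks ties that cosine similarity alone cannot, which is the intuition behind exploiting imbalance-induced norm differences.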