Wang Wenjian, Duan Lijuan, Wang Yuxi, Fan Junsong, Zhang Zhaoxiang
IEEE Trans Pattern Anal Mach Intell. 2023 Dec;45(12):15018-15035. doi: 10.1109/TPAMI.2023.3306352. Epub 2023 Nov 3.
Few-shot learning aims to recognize novel categories from only a few labeled samples, and existing few-shot methods primarily focus on categories sampled from the same distribution. However, this assumption does not always hold in practice, and domain shift significantly degrades few-shot learning performance. To remedy this problem, we investigate an interesting and challenging cross-domain few-shot learning task, in which the training and testing tasks come from different domains. Specifically, we propose a Meta-Memory scheme that bridges the domain gap between the source and target domains with style-memory and content-memory components. The former stores intra-domain style information from source-domain instances to provide a richer feature distribution, while the latter stores semantic information by exploring the knowledge of different categories. Combined with a contrastive learning strategy, our model effectively alleviates the cross-domain problem in few-shot learning. Extensive experiments demonstrate that the proposed method achieves state-of-the-art performance on cross-domain few-shot semantic segmentation on the COCO-20i, PASCAL-5i, FSS-1000, and SUIM datasets, and also positively affects few-shot classification on Meta-Dataset.
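To make the style-memory idea concrete, here is a minimal, hypothetical sketch assuming "style" is summarized by channel-wise mean/std statistics of convolutional features (an AdaIN-style convention). This is not the paper's implementation; the class name StyleMemory, the capacity parameter, and the write()/restyle() methods are illustrative assumptions only.

```python
# Hypothetical style-memory sketch: store per-instance feature statistics from the
# source domain and re-normalize new features with sampled styles, which broadens
# the feature distribution seen during training.
import torch


class StyleMemory:
    def __init__(self, capacity: int = 1024):
        self.capacity = capacity          # maximum number of stored style entries
        self.means, self.stds = [], []    # one (C,) statistics vector per source instance

    @torch.no_grad()
    def write(self, feat: torch.Tensor) -> None:
        """Store channel-wise statistics of source features shaped (N, C, H, W)."""
        mu = feat.mean(dim=(2, 3))                      # (N, C)
        sigma = feat.std(dim=(2, 3)) + 1e-6             # (N, C), avoid zero std
        for m, s in zip(mu, sigma):
            self.means.append(m)
            self.stds.append(s)
        # keep only the most recent `capacity` entries
        self.means = self.means[-self.capacity:]
        self.stds = self.stds[-self.capacity:]

    @torch.no_grad()
    def restyle(self, feat: torch.Tensor) -> torch.Tensor:
        """Re-normalize features with a randomly sampled stored style."""
        if not self.means:
            return feat
        idx = torch.randint(len(self.means), (feat.size(0),))
        mu_new = torch.stack([self.means[i] for i in idx])[..., None, None]
        sd_new = torch.stack([self.stds[i] for i in idx])[..., None, None]
        mu_old = feat.mean(dim=(2, 3), keepdim=True)
        sd_old = feat.std(dim=(2, 3), keepdim=True) + 1e-6
        return (feat - mu_old) / sd_old * sd_new + mu_new


# Usage: accumulate styles from source-domain features, then restyle new features.
memory = StyleMemory(capacity=256)
memory.write(torch.randn(4, 64, 32, 32))
augmented = memory.restyle(torch.randn(4, 64, 32, 32))
```

A content-memory component would, analogously, store category-level semantic prototypes rather than feature statistics; the contrastive objective mentioned in the abstract would then pull features toward matching memory entries and push them away from mismatched ones.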