IEEE Trans Med Imaging. 2023 Dec;42(12):3794-3804. doi: 10.1109/TMI.2023.3307892. Epub 2023 Nov 30.
Deep learning models have achieved remarkable success in multi-type nuclei segmentation. These models are mostly trained once with full annotations of all nucleus types available, and lack the ability to continually learn new classes due to catastrophic forgetting. In this paper, we study the practical and important problem of class-incremental continual learning, where the model is incrementally updated to new classes without access to previous data. We propose a novel continual nuclei segmentation method that avoids forgetting knowledge of old classes and facilitates the learning of new classes by achieving feature-level knowledge distillation through prototype-wise relation distillation and contrastive learning. Concretely, prototype-wise relation distillation imposes constraints on inter-class relation similarity, encouraging the encoder to extract similar class distributions for old classes in the feature space. Prototype-wise contrastive learning with a hard sampling strategy enhances the intra-class compactness and inter-class separability of features, improving performance on both old and new classes. Experiments on two multi-type nuclei segmentation benchmarks, i.e., MoNuSAC and CoNSeP, demonstrate the effectiveness of our method, with superior performance over many competitive methods. Code is available at https://github.com/zzw-szu/CoNuSeg.
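The two components named in the abstract can be sketched in a toy NumPy form. This is an illustrative reconstruction under stated assumptions, not the authors' implementation (see the linked repository for that): prototypes are taken as per-class mean features, relation distillation is assumed to match cosine-similarity matrices between the old and new models' prototypes, and hard sampling is assumed to keep only the most similar wrong-class prototype as the negative. All function names are hypothetical.

```python
import numpy as np

def class_prototypes(features, labels, num_classes):
    """Per-class mean feature vector; features: (N, D), labels: (N,)."""
    protos = np.zeros((num_classes, features.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = features[mask].mean(axis=0)
    return protos

def relation_matrix(protos):
    """Cosine-similarity matrix between class prototypes."""
    normed = protos / (np.linalg.norm(protos, axis=1, keepdims=True) + 1e-8)
    return normed @ normed.T

def relation_distillation_loss(protos_old, protos_new):
    """Penalize changes in inter-class relations between the old (frozen)
    and new (updated) encoders' prototypes for old classes."""
    diff = relation_matrix(protos_old) - relation_matrix(protos_new)
    return float(np.mean(diff ** 2))

def prototype_contrastive_loss(features, labels, protos, tau=0.1):
    """InfoNCE-style loss pulling each feature toward its class prototype
    and away from the hardest (most similar) other-class prototype."""
    nf = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-8)
    np_ = protos / (np.linalg.norm(protos, axis=1, keepdims=True) + 1e-8)
    sims = nf @ np_.T / tau  # (N, C) feature-to-prototype similarities
    losses = []
    for i, c in enumerate(labels):
        pos = sims[i, c]
        hard = np.delete(sims[i], c).max()  # hard negative sampling
        # log(1 + exp(hard - pos)) in a numerically explicit form
        losses.append(-pos + np.log(np.exp(pos) + np.exp(hard)))
    return float(np.mean(losses))
```

In this sketch the distillation loss is zero when the new encoder preserves the old inter-class similarity structure exactly, and the contrastive loss is always positive, shrinking as features move closer to their own prototype than to the hardest competing one.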