
Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning.

Publication Information

IEEE Trans Med Imaging. 2023 Dec;42(12):3794-3804. doi: 10.1109/TMI.2023.3307892. Epub 2023 Nov 30.

DOI: 10.1109/TMI.2023.3307892
PMID: 37610902
Abstract

Deep learning models have achieved remarkable success in multi-type nuclei segmentation. These models are mostly trained all at once with full annotations for all nuclei types available, and lack the ability to continually learn new classes due to the problem of catastrophic forgetting. In this paper, we study the practical and important class-incremental continual learning problem, where the model is incrementally updated to new classes without access to previous data. We propose a novel continual nuclei segmentation method that avoids forgetting the knowledge of old classes and facilitates the learning of new classes by performing feature-level knowledge distillation with prototype-wise relation distillation and contrastive learning. Concretely, prototype-wise relation distillation imposes constraints on the inter-class relation similarity, encouraging the encoder to extract a similar class distribution for old classes in the feature space. Prototype-wise contrastive learning with a hard sampling strategy enhances the intra-class compactness and inter-class separability of features, improving the performance on both old and new classes. Experiments on two multi-type nuclei segmentation benchmarks, i.e., MoNuSAC and CoNSeP, demonstrate the effectiveness of our method, which achieves superior performance over many competitive methods. Code is available at https://github.com/zzw-szu/CoNuSeg.
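
The two feature-level objectives described in the abstract can be illustrated with a short, self-contained PyTorch sketch. This is only a rough illustration under assumed names and values (compute_prototypes, relation_distillation_loss, prototype_contrastive_loss, temperature, num_hard are all hypothetical): it computes class prototypes as mean features, distills the inter-class cosine-similarity structure from a frozen old encoder, and applies an InfoNCE-style contrastive loss on the hardest samples per class. It is not the authors' implementation; the official code is at https://github.com/zzw-szu/CoNuSeg.

```python
# Minimal sketch of prototype-wise relation distillation and prototype-wise
# contrastive learning with hard sampling. Names and hyperparameters are
# illustrative assumptions, not the paper's actual code.

import torch
import torch.nn.functional as F


def compute_prototypes(features, labels, num_classes):
    """Mean feature vector (prototype) per class.

    features: (N, D) pixel/patch embeddings; labels: (N,) class ids.
    Classes absent from the batch keep a zero prototype.
    """
    protos = torch.zeros(num_classes, features.size(1), device=features.device)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = features[mask].mean(dim=0)
    return protos


def relation_distillation_loss(old_protos, new_protos):
    """Match the inter-class cosine-similarity structure of the old encoder."""
    old_rel = F.cosine_similarity(old_protos.unsqueeze(1), old_protos.unsqueeze(0), dim=-1)
    new_rel = F.cosine_similarity(new_protos.unsqueeze(1), new_protos.unsqueeze(0), dim=-1)
    return F.mse_loss(new_rel, old_rel)


def prototype_contrastive_loss(features, labels, prototypes, temperature=0.1, num_hard=256):
    """InfoNCE-style loss against class prototypes, restricted to the samples
    least similar to their own prototype (a simple hard-sampling heuristic)."""
    feats = F.normalize(features, dim=-1)
    protos = F.normalize(prototypes, dim=-1)
    logits = feats @ protos.t() / temperature              # (N, C)
    pos_sim = logits.gather(1, labels.view(-1, 1)).squeeze(1)
    k = min(num_hard, feats.size(0))
    hard_idx = torch.topk(-pos_sim, k).indices             # hardest samples
    return F.cross_entropy(logits[hard_idx], labels[hard_idx])


# Toy usage with random data (3 classes, 64-dim features).
if __name__ == "__main__":
    N, D, C = 512, 64, 3
    feats_new = torch.randn(N, D, requires_grad=True)       # current encoder features
    feats_old = torch.randn(N, D)                            # frozen old-encoder features
    labels = torch.randint(0, C, (N,))

    protos_new = compute_prototypes(feats_new, labels, C)
    protos_old = compute_prototypes(feats_old, labels, C)

    loss = relation_distillation_loss(protos_old.detach(), protos_new) \
         + prototype_contrastive_loss(feats_new, labels, protos_new.detach())
    loss.backward()
    print(float(loss))
```

In an incremental step, the old prototypes would presumably come from the model frozen after training on previous classes, and the two losses would be added to the ordinary segmentation loss with weighting coefficients; the exact weighting and prototype update scheme should be taken from the official repository.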


Similar Articles

1. Continual Nuclei Segmentation via Prototype-Wise Relation Distillation and Contrastive Learning. IEEE Trans Med Imaging. 2023 Dec;42(12):3794-3804. doi: 10.1109/TMI.2023.3307892. Epub 2023 Nov 30.
2. ScribSD+: Scribble-supervised medical image segmentation based on simultaneous multi-scale knowledge distillation and class-wise contrastive regularization. Comput Med Imaging Graph. 2024 Sep;116:102416. doi: 10.1016/j.compmedimag.2024.102416. Epub 2024 Jul 9.
3. Adversarial class-wise self-knowledge distillation for medical image segmentation. Sci Rep. 2025 Apr 17;15(1):13231. doi: 10.1038/s41598-025-98116-7.
4. Inherit With Distillation and Evolve With Contrast: Exploring Class Incremental Semantic Segmentation Without Exemplar Memory. IEEE Trans Pattern Anal Mach Intell. 2023 Oct;45(10):11932-11947. doi: 10.1109/TPAMI.2023.3273574. Epub 2023 Sep 5.
5. CCSI: Continual Class-Specific Impression for data-free class incremental learning. Med Image Anal. 2024 Oct;97:103239. doi: 10.1016/j.media.2024.103239. Epub 2024 Jun 15.
6. Uncertainty-Aware Contrastive Distillation for Incremental Semantic Segmentation. IEEE Trans Pattern Anal Mach Intell. 2023 Feb;45(2):2567-2581. doi: 10.1109/TPAMI.2022.3163806. Epub 2023 Jan 6.
7. Multi-granularity knowledge distillation and prototype consistency regularization for class-incremental learning. Neural Netw. 2023 Jul;164:617-630. doi: 10.1016/j.neunet.2023.05.006. Epub 2023 May 11.
8. Complementary Calibration: Boosting General Continual Learning With Collaborative Distillation and Self-Supervision. IEEE Trans Image Process. 2023;32:657-667. doi: 10.1109/TIP.2022.3230457. Epub 2023 Jan 6.
9. Privacy-Preserving Synthetic Continual Semantic Segmentation for Robotic Surgery. IEEE Trans Med Imaging. 2024 Jun;43(6):2291-2302. doi: 10.1109/TMI.2024.3364969. Epub 2024 Jun 3.
10. CL3: Generalization of Contrastive Loss for Lifelong Learning. J Imaging. 2023 Nov 23;9(12):259. doi: 10.3390/jimaging9120259.