

Self-paced and self-consistent co-training for semi-supervised image segmentation.

Affiliations

Department of Software and IT Engineering, Ecole de technologie supérieure, Montreal, H3C1K3, Canada.

School of Software, Shandong University, Jinan, 250101, China.

Publication information

Med Image Anal. 2021 Oct;73:102146. doi: 10.1016/j.media.2021.102146. Epub 2021 Jun 26.

Abstract

Deep co-training has recently been proposed as an effective approach for image segmentation when annotated data is scarce. In this paper, we improve existing approaches for semi-supervised segmentation with a self-paced and self-consistent co-training method. To help distill information from unlabeled images, we first design a self-paced learning strategy for co-training that lets jointly-trained neural networks focus on easier-to-segment regions first, and then gradually consider harder ones. This is achieved via an end-to-end differentiable loss in the form of a generalized Jensen-Shannon divergence (JSD). Moreover, to encourage predictions from different networks to be both consistent and confident, we enhance this generalized JSD loss with an uncertainty regularizer based on entropy. The robustness of individual models is further improved using a self-ensembling loss that enforces their predictions to be consistent across different training iterations. We demonstrate the potential of our method on three challenging image segmentation problems with different image modalities, using a small fraction of labeled data. Results show clear advantages in terms of performance compared to standard co-training baselines and recently proposed state-of-the-art approaches for semi-supervised segmentation.
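To make the core loss concrete, the following is a minimal sketch of a generalized Jensen-Shannon divergence across the per-pixel class distributions predicted by M jointly-trained networks, with an optional entropy penalty on the mean distribution standing in for the abstract's uncertainty regularizer. The function names, the weighting scheme, and the exact form of the regularizer are illustrative assumptions, not the paper's implementation; the actual method applies this per pixel over feature maps and combines it with self-paced weighting and a self-ensembling term.

```python
import math

def entropy(p):
    # Shannon entropy (natural log) of a discrete distribution.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def generalized_jsd(dists, lam=0.0):
    """Generalized JSD across M predicted class distributions.

    JSD = H(mean of distributions) - mean of H(each distribution).
    It is zero when all networks agree, and grows with disagreement.
    `lam` adds an (assumed) entropy penalty on the mean distribution,
    pushing agreement toward confident, low-entropy predictions.
    """
    m = len(dists)
    num_classes = len(dists[0])
    mean = [sum(d[k] for d in dists) / m for k in range(num_classes)]
    jsd = entropy(mean) - sum(entropy(d) for d in dists) / m
    return jsd + lam * entropy(mean)
```

For two networks that agree exactly, the divergence is 0; for two that predict opposite one-hot classes, it reaches its maximum of ln 2, so minimizing it drives the co-trained networks toward consistent predictions on unlabeled pixels.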

