Suppr 超能文献



Knowledge distillation with ensembles of convolutional neural networks for medical image segmentation.

Authors

Noothout Julia M H, Lessmann Nikolas, van Eede Matthijs C, van Harten Louis D, Sogancioglu Ecem, Heslinga Friso G, Veta Mitko, van Ginneken Bram, Išgum Ivana

Affiliations

Amsterdam University Medical Center, University of Amsterdam, Department of Biomedical Engineering and Physics, Amsterdam, The Netherlands.

Radboud University Medical Center, Department of Medical Imaging, Nijmegen, The Netherlands.

Publication

J Med Imaging (Bellingham). 2022 Sep;9(5):052407. doi: 10.1117/1.JMI.9.5.052407. Epub 2022 May 28.

DOI: 10.1117/1.JMI.9.5.052407
PMID: 35692896
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9142841/
Abstract

Ensembles of convolutional neural networks (CNNs) often outperform a single CNN in medical image segmentation tasks, but inference is computationally more expensive, which makes ensembles unattractive for some applications. We compared the performance of differently constructed ensembles with the performance of CNNs derived from these ensembles using knowledge distillation, a technique for reducing the footprint of large models such as ensembles. We investigated two different types of ensembles: diverse ensembles of networks with three different architectures and two different loss functions, and uniform ensembles of networks with the same architecture but initialized with different random seeds. For each ensemble, a single student network was additionally trained to mimic the class probabilities predicted by the teacher model, the ensemble. We evaluated the performance of each network, the ensembles, and the corresponding distilled networks across three publicly available datasets: chest computed tomography scans with four annotated organs of interest, brain magnetic resonance imaging (MRI) with six annotated brain structures, and cardiac cine-MRI with three annotated heart structures. Both uniform and diverse ensembles obtained better results than any of the individual networks in the ensemble. Furthermore, applying knowledge distillation yielded a single network that was smaller and faster without compromising performance compared with the ensemble it learned from. The distilled networks significantly outperformed the same network trained with reference segmentations instead of knowledge distillation. Knowledge distillation can thus compress segmentation ensembles of uniform or diverse composition into a single CNN while maintaining the performance of the ensemble.
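The teacher–student setup the abstract describes — a single student CNN trained to mimic the per-pixel class probabilities of the ensemble — can be sketched in NumPy. This is a minimal illustration under stated assumptions, not the authors' implementation: the array shapes, the plain (temperature-free) soft cross-entropy, and the function names are all hypothetical.

```python
import numpy as np

def ensemble_soft_targets(teacher_probs):
    # teacher_probs: list of (H, W, C) per-pixel class-probability maps,
    # one per ensemble member; the teacher signal is their average.
    return np.mean(np.stack(teacher_probs, axis=0), axis=0)

def distillation_loss(student_logits, soft_targets, eps=1e-12):
    # Soft cross-entropy between the student's per-pixel softmax output
    # and the ensemble-averaged teacher probabilities.
    z = student_logits - student_logits.max(axis=-1, keepdims=True)
    student_probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    per_pixel = -np.sum(soft_targets * np.log(student_probs + eps), axis=-1)
    return float(np.mean(per_pixel))
```

During training, minimizing this loss (possibly mixed with a loss on the reference segmentations) stands in for hard-label cross-entropy, so the student inherits the ensemble's softened decision boundaries while keeping the inference cost of a single network.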


Similar articles

1. Knowledge distillation with ensembles of convolutional neural networks for medical image segmentation.
J Med Imaging (Bellingham). 2022 Sep;9(5):052407. doi: 10.1117/1.JMI.9.5.052407. Epub 2022 May 28.
2. Semi-supervised learning for automatic segmentation of the knee from MRI with convolutional neural networks.
Comput Methods Programs Biomed. 2020 Jun;189:105328. doi: 10.1016/j.cmpb.2020.105328. Epub 2020 Jan 11.
3. Graph Flow: Cross-Layer Graph Flow Distillation for Dual Efficient Medical Image Segmentation.
IEEE Trans Med Imaging. 2023 Apr;42(4):1159-1171. doi: 10.1109/TMI.2022.3224459. Epub 2023 Apr 3.
4. Efficient skin lesion segmentation with boundary distillation.
Med Biol Eng Comput. 2024 Sep;62(9):2703-2716. doi: 10.1007/s11517-024-03095-y. Epub 2024 May 1.
5. Knowledge distillation on individual vertebrae segmentation exploiting 3D U-Net.
Comput Med Imaging Graph. 2024 Apr;113:102350. doi: 10.1016/j.compmedimag.2024.102350. Epub 2024 Feb 8.
6. Automatic cardiac cine MRI segmentation and heart disease classification.
Comput Med Imaging Graph. 2021 Mar;88:101864. doi: 10.1016/j.compmedimag.2021.101864. Epub 2021 Jan 13.
7. Deep cross-modality (MR-CT) educed distillation learning for cone beam CT lung tumor segmentation.
Med Phys. 2021 Jul;48(7):3702-3713. doi: 10.1002/mp.14902. Epub 2021 May 25.
8. MSKD: Structured knowledge distillation for efficient medical image segmentation.
Comput Biol Med. 2023 Sep;164:107284. doi: 10.1016/j.compbiomed.2023.107284. Epub 2023 Aug 2.
9. Efficient Medical Image Segmentation Based on Knowledge Distillation.
IEEE Trans Med Imaging. 2021 Dec;40(12):3820-3831. doi: 10.1109/TMI.2021.3098703. Epub 2021 Nov 30.
10. Deep Ensembles Are Robust to Occasional Catastrophic Failures of Individual DNNs for Organs Segmentations in CT Images.
J Digit Imaging. 2023 Oct;36(5):2060-2074. doi: 10.1007/s10278-023-00857-2. Epub 2023 Jun 8.

Cited by

1. Enhancement and evaluation for deep learning-based classification of volumetric neuroimaging with 3D-to-2D knowledge distillation.
Sci Rep. 2024 Nov 28;14(1):29611. doi: 10.1038/s41598-024-80938-6.
2. Special Section Guest Editorial: Advances in High-Dimensional Medical Image Processing.
J Med Imaging (Bellingham). 2022 Sep;9(5):052401. doi: 10.1117/1.JMI.9.5.052401. Epub 2022 Oct 31.

References

1. Efficient knowledge distillation for liver CT segmentation using growing assistant network.
Phys Med Biol. 2021 Nov 26;66(23). doi: 10.1088/1361-6560/ac3935.
2. Classification of diabetic retinopathy using unlabeled data and knowledge distillation.
Artif Intell Med. 2021 Nov;121:102176. doi: 10.1016/j.artmed.2021.102176. Epub 2021 Sep 17.
3. Learning With Privileged Multimodal Knowledge for Unimodal Segmentation.
IEEE Trans Med Imaging. 2022 Mar;41(3):621-632. doi: 10.1109/TMI.2021.3119385. Epub 2022 Mar 2.
4. Efficient Medical Image Segmentation Based on Knowledge Distillation.
IEEE Trans Med Imaging. 2021 Dec;40(12):3820-3831. doi: 10.1109/TMI.2021.3098703. Epub 2021 Nov 30.
5. AdaEn-Net: An ensemble of adaptive 2D-3D Fully Convolutional Networks for medical image segmentation.
Neural Netw. 2020 Jun;126:76-94. doi: 10.1016/j.neunet.2020.03.007. Epub 2020 Mar 10.
6. Block Level Skip Connections Across Cascaded V-Net for Multi-Organ Segmentation.
IEEE Trans Med Imaging. 2020 Sep;39(9):2782-2793. doi: 10.1109/TMI.2020.2975347. Epub 2020 Feb 21.
7. Unpaired Multi-Modal Segmentation via Knowledge Distillation.
IEEE Trans Med Imaging. 2020 Jul;39(7):2415-2425. doi: 10.1109/TMI.2019.2963882. Epub 2020 Feb 3.
8. Deep Learning Techniques for Medical Image Segmentation: Achievements and Challenges.
J Digit Imaging. 2019 Aug;32(4):582-596. doi: 10.1007/s10278-019-00227-x.
9. Standardized Assessment of Automatic Segmentation of White Matter Hyperintensities and Results of the WMH Segmentation Challenge.
IEEE Trans Med Imaging. 2019 Nov;38(11):2556-2568. doi: 10.1109/TMI.2019.2905770. Epub 2019 Mar 19.
10. Multiorgan segmentation using distance-aware adversarial networks.
J Med Imaging (Bellingham). 2019 Jan;6(1):014001. doi: 10.1117/1.JMI.6.1.014001. Epub 2019 Jan 10.