Yu Xiangchun, Shen Jiaqing, Zhang Dingwen, Zheng Jian
Jiangxi Provincial Key Laboratory of Multidimensional Intelligent Perception and Control, School of Information Engineering, Jiangxi University of Science and Technology, Ganzhou, China.
Sci Rep. 2025 Apr 17;15(1):13231. doi: 10.1038/s41598-025-98116-7.
Self-knowledge distillation enables knowledge transfer by dynamically constructing the next-stage learning objectives, thus providing more effective path cues to optimize the compact student. The challenge lies in formulating effective learning objectives for the upcoming stage that mitigate the interference of inter-class similarity in medical image segmentation. This paper presents an Adversarial Class-Wise Self-Knowledge Distillation (ACW-SKD) method. ACW-SKD leverages an auxiliary head to generate coarse segmentation results, which are then used as prediction masks to refine class-wise features, followed by class-wise feature distillation to mitigate inter-class similarity. A feature reconstruction module (FRM) is introduced at the penultimate feature layer and the class-wise feature layer to avoid plugging in multiple intermediate branches when constructing the next-stage learning objectives. Furthermore, an adversarial temperature loss with a learnable temperature module is incorporated to better distinguish inter-class similarity. Extensive experiments are conducted on three benchmark datasets: Synapse, FLARE2022, and M2caiSeg. The results indicate that ACW-SKD surpasses several offline knowledge distillation methods, self-knowledge distillation methods, and U-Net networks. Ablation studies and visual analysis further validate the efficacy of ACW-SKD. This method notably enhances segmentation accuracy for challenging classes and mitigates the influence of inter-class similarity in medical image segmentation. Moreover, ACW-SKD delivers results comparable to U-Net at a reduced computational cost, positioning it as a viable option for deploying efficient medical image segmentation models on mobile devices. Our code is available at https://github.com/shenjq77/ACW-SKD.
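The two core ideas in the abstract — using the auxiliary head's coarse prediction as a mask to pool class-wise features, and distilling with a learnable temperature — can be sketched roughly as follows. This is an illustrative sketch only: the function names, tensor shapes, masked-average pooling, and the temperature parameterization are assumptions for exposition, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def class_wise_features(features, aux_logits, num_classes):
    """Pool one feature vector per class, using the auxiliary head's coarse
    segmentation as a prediction mask (masked average pooling).

    features:   (B, C, H, W) feature map
    aux_logits: (B, K, H, W) coarse per-pixel class logits from the auxiliary head
    returns:    (B, K, C) class-wise feature vectors
    """
    pred = aux_logits.argmax(dim=1)                       # (B, H, W) hard mask
    masks = F.one_hot(pred, num_classes)                  # (B, H, W, K)
    masks = masks.permute(0, 3, 1, 2).float()             # (B, K, H, W)
    area = masks.sum(dim=(2, 3)).clamp(min=1.0)           # (B, K) pixels per class
    # Sum features over each class region, then normalize by region size.
    feats = torch.einsum('bchw,bkhw->bkc', features, masks)
    return feats / area.unsqueeze(-1)


class LearnableTemperature(nn.Module):
    """Learnable softening temperature; parameterized in log space so it
    stays positive during optimization (parameterization is an assumption)."""

    def __init__(self, init_t=4.0):
        super().__init__()
        self.log_t = nn.Parameter(torch.log(torch.tensor(float(init_t))))

    def forward(self):
        return self.log_t.exp()


def distillation_loss(student_logits, teacher_logits, t):
    """Standard temperature-scaled KL distillation term, scaled by t^2."""
    p_teacher = F.softmax(teacher_logits / t, dim=-1)
    log_p_student = F.log_softmax(student_logits / t, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction='batchmean') * t * t
```

In an adversarial-temperature setup along these lines, the temperature parameter would be updated to *increase* the distillation loss (e.g. via a gradient-reversal step) while the student is updated to decrease it, pushing the student to resolve classes that remain confusable at harder temperatures.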