A contrast enhanced representation normalization approach to knowledge distillation.

Authors

Bao Zhiqiang, Zhu Di, Du Leying, Li Yang

Affiliations

School of Communication and Information Engineering, Xi'an University of Posts & Telecommunications, Xi'an, 710121, China.

School of Artificial Intelligence, Xi'an University of Posts & Telecommunications, Xi'an, 710121, China.

Publication

Sci Rep. 2025 Apr 16;15(1):13197. doi: 10.1038/s41598-025-97699-5.

DOI: 10.1038/s41598-025-97699-5
PMID: 40240441
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC12003734/
Abstract

Within the scope of knowledge distillation research, contrastive representation distillation has achieved remarkable results by introducing the Contrastive Representation Distillation Loss. However, previous work has paid relatively little attention to factors at the input-sample level. We observe that the large number of negative sample pairs involved in knowledge transfer leads to information redundancy. To mitigate this issue, we propose a representation normalization method and apply it to contrastive representation distillation, with the aim of reducing the information redundancy caused by negative sample pairs. In addition, drawing on the idea of the triplet loss from contrastive learning, we construct a loss function and integrate it into the Contrastive Representation Distillation Loss to form the Contrast Enhanced Representation Normalization Distillation Loss. This new loss enhances the similarity between positive sample pairs and increases the distance between negative sample pairs. Experimental results demonstrate that the Contrast Enhanced Representation Normalization Distillation algorithm outperforms the Contrastive Representation Distillation algorithm on the CIFAR-100 and ImageNet datasets and performs strongly against other state-of-the-art knowledge distillation methods. This not only enables model deployment on resource-constrained devices, but also shows broad application potential in tasks such as image segmentation, providing strong support for related research and practical applications.
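As a rough illustration of the recipe the abstract describes, the minimal PyTorch sketch below L2-normalizes the student and teacher representations and combines an InfoNCE-style contrastive distillation term with a triplet-style margin term that pulls positive pairs together and pushes negative pairs apart. The function name, margin, temperature, and the use of in-batch negatives are assumptions made for illustration; the paper's exact formulation may differ.

```python
# Illustrative sketch only; not the authors' exact loss. Assumes positive
# pairs are (student_i, teacher_i) and negatives are in-batch mismatches.
import torch
import torch.nn.functional as F


def cern_distillation_loss(student_feats: torch.Tensor,
                           teacher_feats: torch.Tensor,
                           temperature: float = 0.07,
                           margin: float = 0.2,
                           triplet_weight: float = 1.0) -> torch.Tensor:
    """student_feats, teacher_feats: (batch, dim) representations of the
    same inputs from the student and teacher networks."""
    # Representation normalization: project both feature sets onto the
    # unit hypersphere so all similarities are bounded cosine similarities.
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)

    # Pairwise similarity matrix; the diagonal holds positive pairs
    # (student i vs. teacher i), off-diagonals are in-batch negatives.
    logits = s @ t.T / temperature                      # (batch, batch)
    targets = torch.arange(s.size(0), device=s.device)

    # InfoNCE-style contrastive distillation term (CRD-like).
    nce_loss = F.cross_entropy(logits, targets)

    # Triplet-style term: each positive similarity should exceed the
    # hardest in-batch negative similarity by at least `margin`.
    sim = s @ t.T                                       # cosine similarities
    pos = sim.diag()                                    # (batch,)
    off_diag = ~torch.eye(s.size(0), dtype=torch.bool, device=s.device)
    hardest_neg = sim.masked_fill(~off_diag, float('-inf')).max(dim=1).values
    triplet_loss = F.relu(hardest_neg - pos + margin).mean()

    return nce_loss + triplet_weight * triplet_loss


if __name__ == "__main__":
    s = torch.randn(8, 128)   # dummy student features
    t = torch.randn(8, 128)   # dummy teacher features
    print(cern_distillation_loss(s, t))
```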


Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3650/12003734/ab15d8db1887/41598_2025_97699_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3650/12003734/995c43a1c87f/41598_2025_97699_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3650/12003734/9fd194039dba/41598_2025_97699_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3650/12003734/47220ae55557/41598_2025_97699_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3650/12003734/075901195b4c/41598_2025_97699_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3650/12003734/969b29c5fe8d/41598_2025_97699_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3650/12003734/f6d937631cd8/41598_2025_97699_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3650/12003734/3685f9d02d71/41598_2025_97699_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3650/12003734/4b5318146e5b/41598_2025_97699_Fig9_HTML.jpg

Similar Articles

1. A contrast enhanced representation normalization approach to knowledge distillation.
   Sci Rep. 2025 Apr 16;15(1):13197. doi: 10.1038/s41598-025-97699-5.
2. DCCD: Reducing Neural Network Redundancy via Distillation.
   IEEE Trans Neural Netw Learn Syst. 2024 Jul;35(7):10006-10017. doi: 10.1109/TNNLS.2023.3238337. Epub 2024 Jul 8.
3. ABUS tumor segmentation via decouple contrastive knowledge distillation.
   Phys Med Biol. 2023 Dec 26;69(1). doi: 10.1088/1361-6560/ad1274.
4. SimCVD: Simple Contrastive Voxel-Wise Representation Distillation for Semi-Supervised Medical Image Segmentation.
   IEEE Trans Med Imaging. 2022 Sep;41(9):2228-2237. doi: 10.1109/TMI.2022.3161829. Epub 2022 Aug 31.
5. Uncertainty-Aware Contrastive Distillation for Incremental Semantic Segmentation.
   IEEE Trans Pattern Anal Mach Intell. 2023 Feb;45(2):2567-2581. doi: 10.1109/TPAMI.2022.3163806. Epub 2023 Jan 6.
6. Self-supervised contrastive graph representation with node and graph augmentation.
   Neural Netw. 2023 Oct;167:223-232. doi: 10.1016/j.neunet.2023.08.039. Epub 2023 Aug 24.
7. ScribSD+: Scribble-supervised medical image segmentation based on simultaneous multi-scale knowledge distillation and class-wise contrastive regularization.
   Comput Med Imaging Graph. 2024 Sep;116:102416. doi: 10.1016/j.compmedimag.2024.102416. Epub 2024 Jul 9.
8. CACL: Cluster-Aware Adversarial Contrastive Learning for Pathological Image Analysis.
   IEEE J Biomed Health Inform. 2025 Jul;29(7):5095-5108. doi: 10.1109/JBHI.2025.3552640.
9. Local contrastive loss with pseudo-label based self-training for semi-supervised medical image segmentation.
   Med Image Anal. 2023 Jul;87:102792. doi: 10.1016/j.media.2023.102792. Epub 2023 Mar 11.
10. Anatomy-Aware Contrastive Representation Learning for Fetal Ultrasound.
    Comput Vis ECCV. 2022 Oct;2022:422-436. doi: 10.1007/978-3-031-25066-8_23.

References Cited in This Article

1. Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition.
   IEEE Trans Pattern Anal Mach Intell. 2023 Aug;45(8):10212-10227. doi: 10.1109/TPAMI.2023.3257878. Epub 2023 Jun 30.