


Learning Student Network Under Universal Label Noise.

Authors

Tang Jialiang, Jiang Ning, Zhu Hongyuan, Tianyi Zhou Joey, Gong Chen

Publication

IEEE Trans Image Process. 2024;33:4363-4376. doi: 10.1109/TIP.2024.3430539. Epub 2024 Aug 2.

DOI: 10.1109/TIP.2024.3430539
PMID: 39074017
Abstract

Data-free knowledge distillation aims to learn a small student network from a large pre-trained teacher network without the aid of the original training data. Recent works propose to gather alternative data from the Internet for training the student network. In a more realistic scenario, the data on the Internet contain two types of label noise, namely: 1) closed-set label noise, where some examples belong to the known categories but are mislabeled; and 2) open-set label noise, where the true labels of some mislabeled examples fall outside the known categories. However, the latter is largely ignored by existing works, leading to limited student network performance. Therefore, this paper proposes a novel data-free knowledge distillation paradigm that utilizes a webly-collected dataset under universal label noise, meaning that both closed-set and open-set label noise must be tackled. Specifically, we first split the collected noisy dataset into a clean set, a closed noisy set, and an open noisy set based on the prediction uncertainty of the various data types. For the closed-set noisy examples, the labels are refined by the teacher network. Meanwhile, noise-robust hybrid contrastive learning is performed on the clean set and the refined closed noisy set to encourage the student network to learn the categorical and instance knowledge inherited from the teacher network. For the open-set noisy examples, unexplored by previous work, we regard them as unlabeled and conduct self-supervised learning on them to enrich the supervision signal for the student network. Extensive experimental results on image classification tasks demonstrate that our approach achieves superior performance to state-of-the-art data-free knowledge distillation methods.
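The dataset-splitting step described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's actual formulation: here the teacher's prediction uncertainty is measured by predictive entropy, and the threshold `tau` and the agreement-based clean/closed criterion are assumptions introduced for the sketch.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def entropy(probs):
    """Shannon entropy of a predictive distribution."""
    return -sum(p * math.log(p + 1e-12) for p in probs)

def partition_by_uncertainty(teacher_logits, web_labels, tau=0.5):
    """Split web-collected examples into clean / closed-noisy / open-noisy
    index sets from the teacher's predictions (hypothetical criterion):
      - confident teacher + label agreement  -> clean set
      - confident teacher + label disagreement -> closed-set noise,
        with the label refined to the teacher's prediction
      - uncertain teacher (high entropy)     -> open-set noise,
        treated as unlabeled for self-supervised learning
    """
    clean, closed, open_set = [], [], []
    refined = list(web_labels)
    for i, (logits, label) in enumerate(zip(teacher_logits, web_labels)):
        probs = softmax(logits)
        pred = probs.index(max(probs))
        if entropy(probs) >= tau:
            open_set.append(i)        # likely outside the known categories
        elif pred == label:
            clean.append(i)           # teacher agrees with the web label
        else:
            closed.append(i)
            refined[i] = pred         # teacher refines the closed-set label
    return clean, closed, open_set, refined
```

In this sketch, a confident teacher that disagrees with the web label signals a mislabeled in-distribution example, while high-entropy predictions flag examples the teacher cannot place in any known category.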


Similar Articles

1. Learning Student Network Under Universal Label Noise.
   IEEE Trans Image Process. 2024;33:4363-4376. doi: 10.1109/TIP.2024.3430539. Epub 2024 Aug 2.
2. Extended T: Learning With Mixed Closed-Set and Open-Set Noisy Labels.
   IEEE Trans Pattern Anal Mach Intell. 2023 Mar;45(3):3047-3058. doi: 10.1109/TPAMI.2022.3180545. Epub 2023 Feb 3.
3. S-CUDA: Self-cleansing unsupervised domain adaptation for medical image segmentation.
   Med Image Anal. 2021 Dec;74:102214. doi: 10.1016/j.media.2021.102214. Epub 2021 Aug 12.
4. Knowledge Distillation Meets Label Noise Learning: Ambiguity-Guided Mutual Label Refinery.
   IEEE Trans Neural Netw Learn Syst. 2025 Jan;36(1):939-952. doi: 10.1109/TNNLS.2023.3335829. Epub 2025 Jan 7.
5. Complementary label learning based on knowledge distillation.
   Math Biosci Eng. 2023 Sep 19;20(10):17905-17918. doi: 10.3934/mbe.2023796.
6. Sample self-selection using dual teacher networks for pathological image classification with noisy labels.
   Comput Biol Med. 2024 May;174:108489. doi: 10.1016/j.compbiomed.2024.108489. Epub 2024 Apr 16.
7. Breast tumor classification through learning from noisy labeled ultrasound images.
   Med Phys. 2020 Mar;47(3):1048-1057. doi: 10.1002/mp.13966. Epub 2019 Dec 30.
8. Local contrastive loss with pseudo-label based self-training for semi-supervised medical image segmentation.
   Med Image Anal. 2023 Jul;87:102792. doi: 10.1016/j.media.2023.102792. Epub 2023 Mar 11.
9. A Parametrical Model for Instance-Dependent Label Noise.
   IEEE Trans Pattern Anal Mach Intell. 2023 Dec;45(12):14055-14068. doi: 10.1109/TPAMI.2023.3301876. Epub 2023 Nov 3.
10. BadLabel: A Robust Perspective on Evaluating and Enhancing Label-Noise Learning.
    IEEE Trans Pattern Anal Mach Intell. 2024 Jun;46(6):4398-4409. doi: 10.1109/TPAMI.2024.3355425. Epub 2024 May 7.