
Localization Distillation for Object Detection

Authors

Zheng Zhaohui, Ye Rongguang, Hou Qibin, Ren Dongwei, Wang Ping, Zuo Wangmeng, Cheng Ming-Ming

Publication

IEEE Trans Pattern Anal Mach Intell. 2023 Aug;45(8):10070-10083. doi: 10.1109/TPAMI.2023.3248583. Epub 2023 Jun 30.

DOI: 10.1109/TPAMI.2023.3248583
PMID: 37027640
Abstract

Previous knowledge distillation (KD) methods for object detection mostly focus on feature imitation rather than mimicking the prediction logits, because the latter is inefficient at distilling localization information. In this paper, we investigate whether logit mimicking always lags behind feature imitation. Towards this goal, we first present a novel localization distillation (LD) method that can efficiently transfer localization knowledge from the teacher to the student. Second, we introduce the concept of the valuable localization region, which aids in selectively distilling the classification and localization knowledge for a given region. Combining these two new components, we show for the first time that logit mimicking can outperform feature imitation, and that the absence of localization distillation is a critical reason why logit mimicking has under-performed for years. Thorough studies exhibit the great potential of logit mimicking: it can significantly alleviate localization ambiguity, learn robust feature representations, and ease training difficulty in the early stage. We also provide a theoretical connection between the proposed LD and classification KD: they share an equivalent optimization effect. Our distillation scheme is simple as well as effective and can be easily applied to both dense horizontal object detectors and rotated object detectors. Extensive experiments on the MS COCO, PASCAL VOC, and DOTA benchmarks demonstrate that our method achieves considerable AP improvement without any sacrifice of inference speed. Our source code and pretrained models are publicly available at https://github.com/HikariTJU/LD.
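The abstract describes LD only at a high level. Below is a minimal sketch of the core operation, assuming a GFocal-style detector head in which each bounding-box edge is predicted as a distribution over discrete bins; the tensor shapes, temperature value, and function name `ld_loss` are illustrative assumptions, not details taken from the paper (see the authors' repository for the actual implementation):

```python
import torch
import torch.nn.functional as F

def ld_loss(student_box_logits, teacher_box_logits, temperature=10.0):
    """Localization distillation as temperature-scaled KL divergence.

    Both inputs are assumed to have shape (num_positions, 4, num_bins):
    each of the four box edges is predicted as a discrete distribution
    over num_bins offsets (GFocal-style representation). Shapes and the
    temperature are illustrative assumptions, not values from the paper.
    """
    T = temperature
    log_p_student = F.log_softmax(student_box_logits / T, dim=-1)
    p_teacher = F.softmax(teacher_box_logits / T, dim=-1)
    # KL(teacher || student) over the softened bin distributions,
    # scaled by T^2 as in standard classification KD.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)

# Toy usage: 8 positions, 4 box edges, 17 bins per edge.
student = torch.randn(8, 4, 17, requires_grad=True)
teacher = torch.randn(8, 4, 17)
loss = ld_loss(student, teacher.detach())
loss.backward()
```

Structurally this is the same temperature-scaled KL divergence used in classification KD, applied to box-edge distributions instead of class scores, which matches the abstract's claim that LD and classification KD share an equivalent optimization effect.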


Similar Articles

1. Localization Distillation for Object Detection.
IEEE Trans Pattern Anal Mach Intell. 2023 Aug;45(8):10070-10083. doi: 10.1109/TPAMI.2023.3248583. Epub 2023 Jun 30.
2. Cosine similarity-guided knowledge distillation for robust object detectors.
Sci Rep. 2024 Aug 14;14(1):18888. doi: 10.1038/s41598-024-69813-6.
3. Relation Knowledge Distillation by Auxiliary Learning for Object Detection.
IEEE Trans Image Process. 2024;33:4796-4810. doi: 10.1109/TIP.2024.3445740. Epub 2024 Aug 30.
4. Decoupled graph knowledge distillation: A general logits-based method for learning MLPs on graphs.
Neural Netw. 2024 Nov;179:106567. doi: 10.1016/j.neunet.2024.106567. Epub 2024 Jul 23.
5. Distilling Knowledge by Mimicking Features.
IEEE Trans Pattern Anal Mach Intell. 2022 Nov;44(11):8183-8195. doi: 10.1109/TPAMI.2021.3103973. Epub 2022 Oct 4.
6. Pixel Distillation: Cost-Flexible Distillation Across Image Sizes and Heterogeneous Networks.
IEEE Trans Pattern Anal Mach Intell. 2024 Dec;46(12):9536-9550. doi: 10.1109/TPAMI.2024.3421277. Epub 2024 Nov 6.
7. Inferior and Coordinate Distillation for Object Detectors.
Sensors (Basel). 2022 Jul 30;22(15):5719. doi: 10.3390/s22155719.
8. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution.
IEEE Trans Neural Netw Learn Syst. 2024 Feb;35(2):2094-2108. doi: 10.1109/TNNLS.2022.3186807. Epub 2024 Feb 5.
9. Structured Knowledge Distillation for Accurate and Efficient Object Detection.
IEEE Trans Pattern Anal Mach Intell. 2023 Dec;45(12):15706-15724. doi: 10.1109/TPAMI.2023.3300470. Epub 2023 Nov 3.
10. Hierarchical Regression and Classification for Accurate Object Detection.
IEEE Trans Neural Netw Learn Syst. 2023 May;34(5):2425-2439. doi: 10.1109/TNNLS.2021.3106641. Epub 2023 May 2.

Cited By

1. CPPE-5: Medical Personal Protective Equipment Dataset.
SN Comput Sci. 2023;4(3):263. doi: 10.1007/s42979-023-01748-7. Epub 2023 Mar 16.