Research on a lightweight electronic component detection method based on knowledge distillation.

Authors

Xia Zilin, Gu Jinan, Wang Wenbo, Huang Zedong

Affiliation

School of Mechanical Engineering, Jiangsu University, Zhenjiang 212013, China.

Publication

Math Biosci Eng. 2023 Nov 22;20(12):20971-20994. doi: 10.3934/mbe.2023928.

DOI: 10.3934/mbe.2023928
PMID: 38124584
Abstract

As an essential step in electronic component assembly, it is crucial to detect electronic components rapidly and accurately. Therefore, a lightweight electronic component detection method based on knowledge distillation is proposed in this study. First, a lightweight student model was constructed. Then, considering issues such as the differing feature expressions of the teacher and the student, a knowledge distillation method combining feature-level and channel-level distillation is proposed so that the student can learn the teacher's rich class-related and inter-class difference features. Finally, comparative experiments were conducted on the datasets. The results show that, compared with the teacher model, the student model's parameter count (13.32 M) is reduced by 55% and its FLOPs (28.7 GMac) by 35%. The knowledge distillation method combining feature and channel improves the student model's mAP by 3.91% and 1.13% on the Pascal VOC and electronic component detection datasets, respectively. As a result of the knowledge distillation, the constructed student model strikes a superior balance between model precision and complexity, allowing for fast and accurate detection of electronic components with a detection precision (mAP) of 97.81% at a speed of 79 FPS.
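The abstract describes a distillation loss that combines feature-level and channel-level terms. The following is a minimal illustrative sketch of such a combined loss in NumPy; the function name, the MSE feature term, the per-channel softened-KL channel term, and the loss weights `alpha`, `beta`, `tau` are assumptions for illustration, not the authors' actual implementation.

```python
import numpy as np

def _softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def feature_channel_distill_loss(f_s, f_t, alpha=1.0, beta=1.0, tau=1.0):
    """Illustrative combined feature- and channel-level distillation loss.

    f_s, f_t: student / teacher feature maps of shape (C, H, W),
    assumed already aligned to the same shape.
    Feature term: mean squared error between the two maps.
    Channel term: KL divergence between each channel's spatial
    distribution (softmax over H*W locations), softened by tau.
    """
    # Feature-level imitation: match activations directly.
    feat_loss = np.mean((f_s - f_t) ** 2)

    # Channel-level imitation: match each channel's spatial distribution.
    c = f_s.shape[0]
    p_t = _softmax(f_t.reshape(c, -1) / tau, axis=1)
    p_s = _softmax(f_s.reshape(c, -1) / tau, axis=1)
    kl = np.sum(p_t * (np.log(p_t + 1e-8) - np.log(p_s + 1e-8))) / c

    return alpha * feat_loss + beta * (tau ** 2) * kl
```

In training, a loss of this form would be added to the student's ordinary detection loss; the `tau ** 2` factor is the usual rescaling so the soft-target gradient magnitude stays comparable as the temperature changes.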

Similar articles

1. Research on a lightweight electronic component detection method based on knowledge distillation.
   Math Biosci Eng. 2023 Nov 22;20(12):20971-20994. doi: 10.3934/mbe.2023928.
2. Learning lightweight tea detector with reconstructed feature and dual distillation.
   Sci Rep. 2024 Oct 10;14(1):23669. doi: 10.1038/s41598-024-73674-4.
3. Lightweight model-based sheep face recognition via face image recording channel.
   J Anim Sci. 2024 Jan 3;102. doi: 10.1093/jas/skae066.
4. Cosine similarity-guided knowledge distillation for robust object detectors.
   Sci Rep. 2024 Aug 14;14(1):18888. doi: 10.1038/s41598-024-69813-6.
5. Structured Knowledge Distillation for Accurate and Efficient Object Detection.
   IEEE Trans Pattern Anal Mach Intell. 2023 Dec;45(12):15706-15724. doi: 10.1109/TPAMI.2023.3300470. Epub 2023 Nov 3.
6. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
   PLoS One. 2023 May 15;18(5):e0285668. doi: 10.1371/journal.pone.0285668. eCollection 2023.
7. Inferior and Coordinate Distillation for Object Detectors.
   Sensors (Basel). 2022 Jul 30;22(15):5719. doi: 10.3390/s22155719.
8. Restructuring the Teacher and Student in Self-Distillation.
   IEEE Trans Image Process. 2024;33:5551-5563. doi: 10.1109/TIP.2024.3463421. Epub 2024 Oct 4.
9. Expanding and Refining Hybrid Compressors for Efficient Object Re-Identification.
   IEEE Trans Image Process. 2024;33:3793-3808. doi: 10.1109/TIP.2024.3410684. Epub 2024 Jun 19.
10. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
   Entropy (Basel). 2023 Jan 7;25(1):125. doi: 10.3390/e25010125.