

Cross-Entropy Pruning for Compressing Convolutional Neural Networks.

Affiliation

School of Software, Dalian University of Technology, Dalian, Liaoning, China

Publication Information

Neural Comput. 2018 Nov;30(11):3128-3149. doi: 10.1162/neco_a_01131. Epub 2018 Sep 14.

DOI: 10.1162/neco_a_01131
PMID: 30216142
Abstract

The success of CNNs is accompanied by deep models and heavy storage costs. For compressing CNNs, we propose an efficient and robust pruning approach, cross-entropy pruning (CEP). Given a trained CNN model, connections are divided into groups according to their corresponding output neurons, and all connections whose cross-entropy errors fall below a grouping threshold are removed. This yields a sparse model and significantly reduces the number of parameters in the baseline model. This letter also presents a highest cross-entropy pruning (HCEP) method that keeps a small portion of the weights with the highest cross-entropy scores, which further improves the accuracy of CEP. To validate CEP, we conducted experiments on low-redundancy networks that are hard to compress. For the MNIST data set, CEP incurs only a 0.08% accuracy drop on the LeNet-5 benchmark while keeping just 16% of the original parameters. CEP also reduces the storage cost of AlexNet on the ILSVRC 2012 data set by approximately 75%, increasing the top-1 error by only 0.4% and the top-5 error by only 0.2%. Compared with three existing methods on LeNet-5, CEP and HCEP perform significantly better in terms of accuracy and stability. CNN-based computer vision tasks such as object detection and style transfer can be computed efficiently using our CEP and HCEP strategies.
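The abstract describes the procedure only at a high level, so the sketch below is an illustration rather than the authors' implementation: it groups a layer's connections by output neuron and zeroes those whose score falls below a group-wise threshold, with HCEP's keep-the-highest rule emulated by a per-group keep fraction. The per-connection "cross-entropy error" is not defined in the abstract; the first-order saliency |w * dL/dw| used here, along with the function name group_wise_prune and the toy shapes, is an assumption made for illustration.

import numpy as np

def group_wise_prune(weights, scores, keep_fraction=0.16):
    """Zero out low-score connections, one group per output neuron (row).

    weights: (out_features, in_features) weight matrix of one layer.
    scores:  same shape; per-connection saliency, standing in for the
             paper's cross-entropy error of each connection.
    keep_fraction: fraction of connections retained in each group
                   (0.16 mirrors the "16% of original parameters"
                   reported for LeNet-5 in the abstract).
    """
    pruned = weights.copy()
    for row in range(weights.shape[0]):       # one group = one output neuron
        group_scores = scores[row]
        # Group-wise threshold: keep the top keep_fraction of this group,
        # emulating HCEP's rule of retaining the highest-scoring weights.
        k = max(1, int(round(keep_fraction * group_scores.size)))
        threshold = np.partition(group_scores, -k)[-k]
        pruned[row, group_scores < threshold] = 0.0
    return pruned

# Toy usage with a random layer; |w * g| is an assumed stand-in score,
# not the paper's cross-entropy error.
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 100))                # 10 output neurons, 100 inputs each
grad = rng.normal(size=W.shape)               # gradient of the CE loss w.r.t. W
saliency = np.abs(W * grad)                   # first-order loss-change estimate
W_sparse = group_wise_prune(W, saliency, keep_fraction=0.16)
print("sparsity:", 1.0 - np.count_nonzero(W_sparse) / W_sparse.size)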


Similar Articles

1. Cross-Entropy Pruning for Compressing Convolutional Neural Networks.
Neural Comput. 2018 Nov;30(11):3128-3149. doi: 10.1162/neco_a_01131. Epub 2018 Sep 14.
2. HRel: Filter pruning based on High Relevance between activation maps and class labels.
Neural Netw. 2022 Mar;147:186-197. doi: 10.1016/j.neunet.2021.12.017. Epub 2021 Dec 30.
3. Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks.
IEEE Trans Cybern. 2020 Aug;50(8):3594-3604. doi: 10.1109/TCYB.2019.2933477. Epub 2019 Aug 27.
4. Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning.
IEEE Trans Neural Netw Learn Syst. 2020 Feb;31(2):574-588. doi: 10.1109/TNNLS.2019.2906563. Epub 2019 Apr 12.
5. Differential Evolution Based Layer-Wise Weight Pruning for Compressing Deep Neural Networks.
Sensors (Basel). 2021 Jan 28;21(3):880. doi: 10.3390/s21030880.
6. Hierarchical Pruning for Simplification of Convolutional Neural Networks in Diabetic Retinopathy Classification.
Annu Int Conf IEEE Eng Med Biol Soc. 2019 Jul;2019:970-973. doi: 10.1109/EMBC.2019.8857769.
7. Evolutionary Multi-Objective One-Shot Filter Pruning for Designing Lightweight Convolutional Neural Network.
Sensors (Basel). 2021 Sep 2;21(17):5901. doi: 10.3390/s21175901.
8. CNNPruner: Pruning Convolutional Neural Networks with Visual Analytics.
IEEE Trans Vis Comput Graph. 2021 Feb;27(2):1364-1373. doi: 10.1109/TVCG.2020.3030461. Epub 2021 Jan 28.
9. Cross-layer importance evaluation for neural network pruning.
Neural Netw. 2024 Nov;179:106496. doi: 10.1016/j.neunet.2024.106496. Epub 2024 Jul 3.
10. Dynamically Optimizing Network Structure Based on Synaptic Pruning in the Brain.
Front Syst Neurosci. 2021 Jun 4;15:620558. doi: 10.3389/fnsys.2021.620558. eCollection 2021.

Cited By

1. Number of necessary training examples for Neural Networks with different number of trainable parameters.
J Pathol Inform. 2022 Jul 6;13:100114. doi: 10.1016/j.jpi.2022.100114. eCollection 2022.