


Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks.

Publication Information

IEEE Trans Cybern. 2020 Aug;50(8):3594-3604. doi: 10.1109/TCYB.2019.2933477. Epub 2019 Aug 27.

DOI: 10.1109/TCYB.2019.2933477
PMID: 31478883
Abstract

Deeper and wider convolutional neural networks (CNNs) achieve superior performance but bring expensive computation cost. Accelerating such overparameterized neural networks has received increasing attention. A typical pruning algorithm is a three-stage pipeline, i.e., training, pruning, and retraining. Prevailing approaches fix the pruned filters to zero during retraining and, thus, significantly reduce the optimization space. Besides, they directly prune a large number of filters at first, which causes unrecoverable information loss. To solve these problems, we propose an asymptotic soft filter pruning (ASFP) method to accelerate the inference procedure of deep neural networks. First, we update the pruned filters during the retraining stage. As a result, the optimization space of the pruned model is not reduced but remains the same as that of the original model. In this way, the model has enough capacity to learn from the training data. Second, we prune the network asymptotically: we prune only a few filters at first and asymptotically prune more filters during the training procedure. With asymptotic pruning, the information of the training set is gradually concentrated in the remaining filters, so the subsequent training and pruning process is stable. The experiments show the effectiveness of our ASFP on image classification benchmarks. Notably, on ILSVRC-2012, our ASFP reduces more than 40% of FLOPs on ResNet-50 with only 0.14% top-5 accuracy degradation, an 8% improvement over soft filter pruning.
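The two ideas in the abstract, soft pruning (zero the weakest filters but keep them trainable) and an asymptotic schedule (prune few filters at first, more as training proceeds), can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact formulation: the L2-norm selection criterion and the exponential-approach schedule are assumptions for the sketch.

```python
import numpy as np

def asymptotic_rate(epoch, total_epochs, target_rate):
    """Pruned fraction grows smoothly from 0 toward target_rate
    (an exponential-approach schedule, one of several possible choices)."""
    return target_rate * (1.0 - np.exp(-3.0 * epoch / total_epochs))

def soft_prune(filters, rate):
    """Zero the lowest-L2-norm filters of one conv layer.

    Soft pruning: the zeroed filters are NOT removed from the model,
    so gradient updates in the next retraining step can revive them,
    and the optimization space stays that of the original model."""
    n = filters.shape[0]                     # number of filters (out channels)
    k = int(n * rate)                        # how many to zero this epoch
    if k == 0:
        return filters.copy()
    norms = np.linalg.norm(filters.reshape(n, -1), axis=1)
    weakest = np.argsort(norms)[:k]          # indices of the k weakest filters
    pruned = filters.copy()
    pruned[weakest] = 0.0                    # zeroed, still trainable
    return pruned
```

In a full training loop, `soft_prune` would be applied to each layer after every epoch with `rate = asymptotic_rate(epoch, ...)`, so early epochs lose almost no filters and the training-set information migrates gradually into the surviving ones.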


Similar Articles

1. Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks.
IEEE Trans Cybern. 2020 Aug;50(8):3594-3604. doi: 10.1109/TCYB.2019.2933477. Epub 2019 Aug 27.
2. Adding Before Pruning: Sparse Filter Fusion for Deep Convolutional Neural Networks via Auxiliary Attention.
IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):3930-3942. doi: 10.1109/TNNLS.2021.3106917. Epub 2025 Feb 28.
3. Filter Pruning by Switching to Neighboring CNNs With Good Attributes.
IEEE Trans Neural Netw Learn Syst. 2023 Oct;34(10):8044-8056. doi: 10.1109/TNNLS.2022.3149332. Epub 2023 Oct 5.
4. Model pruning based on filter similarity for edge device deployment.
Front Neurorobot. 2023 Mar 2;17:1132679. doi: 10.3389/fnbot.2023.1132679. eCollection 2023.
5. Evolutionary Multi-Objective One-Shot Filter Pruning for Designing Lightweight Convolutional Neural Network.
Sensors (Basel). 2021 Sep 2;21(17):5901. doi: 10.3390/s21175901.
6. HRel: Filter pruning based on High Relevance between activation maps and class labels.
Neural Netw. 2022 Mar;147:186-197. doi: 10.1016/j.neunet.2021.12.017. Epub 2021 Dec 30.
7. Redundant feature pruning for accelerated inference in deep neural networks.
Neural Netw. 2019 Oct;118:148-158. doi: 10.1016/j.neunet.2019.04.021. Epub 2019 May 9.
8. REAF: Remembering Enhancement and Entropy-Based Asymptotic Forgetting for Filter Pruning.
IEEE Trans Image Process. 2023;32:3912-3923. doi: 10.1109/TIP.2023.3288986. Epub 2023 Jul 17.
9. Cross-layer importance evaluation for neural network pruning.
Neural Netw. 2024 Nov;179:106496. doi: 10.1016/j.neunet.2024.106496. Epub 2024 Jul 3.
10. A transfer learning with structured filter pruning approach for improved breast cancer classification on point-of-care devices.
Comput Biol Med. 2021 Jul;134:104432. doi: 10.1016/j.compbiomed.2021.104432. Epub 2021 Apr 30.

Cited By

1. IESSP: Information Extraction-Based Sparse Stripe Pruning Method for Deep Neural Networks.
Sensors (Basel). 2025 Apr 3;25(7):2261. doi: 10.3390/s25072261.
2. Model compression for real-time object detection using rigorous gradation pruning.
iScience. 2024 Dec 17;28(1):111618. doi: 10.1016/j.isci.2024.111618. eCollection 2025 Jan 17.
3. Research on Lightweight Method of Insulator Target Detection Based on Improved SSD.
Sensors (Basel). 2024 Sep 12;24(18):5910. doi: 10.3390/s24185910.
4. A Comprehensive Review of Hardware Acceleration Techniques and Convolutional Neural Networks for EEG Signals.
Sensors (Basel). 2024 Sep 7;24(17):5813. doi: 10.3390/s24175813.
5. A geometric approach for accelerating neural networks designed for classification problems.
Sci Rep. 2024 Jul 30;14(1):17590. doi: 10.1038/s41598-024-68172-6.
6. Lightweight Meter Pointer Recognition Method Based on Improved YOLOv5.
Sensors (Basel). 2024 Feb 26;24(5):1507. doi: 10.3390/s24051507.
7. Complex hybrid weighted pruning method for accelerating convolutional neural networks.
Sci Rep. 2024 Mar 6;14(1):5570. doi: 10.1038/s41598-024-55942-5.
8. A Light Vehicle License-Plate-Recognition System Based on Hybrid Edge-Cloud Computing.
Sensors (Basel). 2023 Nov 2;23(21):8913. doi: 10.3390/s23218913.
9. Random pruning: channel sparsity by expectation scaling factor.
PeerJ Comput Sci. 2023 Sep 5;9:e1564. doi: 10.7717/peerj-cs.1564. eCollection 2023.
10. Model pruning based on filter similarity for edge device deployment.
Front Neurorobot. 2023 Mar 2;17:1132679. doi: 10.3389/fnbot.2023.1132679. eCollection 2023.