
Reweighted Alternating Direction Method of Multipliers for DNN weight pruning

Affiliations

MIIT Key Laboratory of Dynamics and Control of Complex Systems, Xi'an 710072, China; School of Mathematics and Statistics, Northwestern Polytechnical University, Xi'an 710072, China.

MIIT Key Laboratory of Dynamics and Control of Complex Systems, Xi'an 710072, China; School of Mechanics, Civil Engineering and Architecture, Northwestern Polytechnical University, Xi'an 710072, China.

Publication Information

Neural Netw. 2024 Nov;179:106534. doi: 10.1016/j.neunet.2024.106534. Epub 2024 Jul 14.

DOI: 10.1016/j.neunet.2024.106534
PMID: 39059046
Abstract

As Deep Neural Networks (DNNs) continue to grow in complexity and size, leading to a substantial computational burden, weight pruning techniques have emerged as an effective solution. This paper presents a novel method for dynamic regularization-based pruning, which incorporates the Alternating Direction Method of Multipliers (ADMM). Unlike conventional methods that employ simple and abrupt threshold processing, the proposed method introduces a reweighting mechanism to assign importance to the weights in DNNs. Compared to other ADMM-based methods, the new method not only achieves higher accuracy but also saves considerable time thanks to the reduced number of necessary hyperparameters. The method is evaluated on multiple architectures, including LeNet-5, ResNet-32, ResNet-56, and ResNet-50, using the MNIST, CIFAR-10, and ImageNet datasets, respectively. Experimental results demonstrate its superior performance in terms of compression ratio and accuracy compared to state-of-the-art pruning methods. In particular, on the LeNet-5 model for the MNIST dataset, it achieves a compression ratio of 355.9× with a slight improvement in accuracy; on the ResNet-50 model trained on the ImageNet dataset, it achieves a compression ratio of 4.24× without sacrificing accuracy.
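To make the ADMM splitting concrete, below is a minimal, self-contained sketch of reweighted-ADMM pruning on a single weight matrix with a toy least-squares loss. It is an illustration under stated assumptions, not the authors' implementation: the reweighting rule alpha_i = 1/(|w_i| + eps), the soft-threshold form of the Z-step, and all hyperparameters (rho, lam, eps, lr) are assumptions made for this sketch; the paper's exact update rules may differ.

```python
# Sketch of reweighted-ADMM weight pruning on one weight matrix.
# ADMM splitting:  min_W f(W) + g(Z)  s.t.  W = Z, with g a (reweighted)
# sparsity-inducing penalty and U the scaled dual variable.
import numpy as np

rng = np.random.default_rng(0)

# Toy "training loss": least squares f(W) = 0.5/n * ||X @ W - Y||^2,
# with a sparse ground-truth weight matrix to recover.
X = rng.normal(size=(200, 64))
W_true = rng.normal(size=(64, 10)) * (rng.random((64, 10)) < 0.1)
Y = X @ W_true

W = rng.normal(size=(64, 10)) * 0.01  # primal variable (the "network" weights)
Z = W.copy()                          # auxiliary variable carrying the sparsity
U = np.zeros_like(W)                  # scaled dual variable
rho, lam, eps, lr = 1.0, 0.01, 0.1, 0.05  # assumed hyperparameters

def soft_threshold(v, t):
    """Element-wise prox of a weighted l1 penalty (t may be an array)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

for outer in range(30):
    # W-step: a few gradient steps on f(W) + (rho/2) * ||W - Z + U||^2.
    for _ in range(50):
        grad = X.T @ (X @ W - Y) / len(X) + rho * (W - Z + U)
        W -= lr * grad

    # Reweighting: alpha_i = 1 / (|w_i| + eps), so weights that have stayed
    # small are penalized heavily while important ones are protected.
    V = W + U
    alpha = 1.0 / (np.abs(V) + eps)

    # Z-step: soft-thresholding with per-weight thresholds, in place of the
    # abrupt hard-threshold projection used by plain ADMM pruning.
    Z = soft_threshold(V, lam * alpha / rho)

    # Dual update.
    U += W - Z

print(f"sparsity of Z: {(Z == 0).mean():.1%}")
```

The contrast with plain ADMM pruning sits in the Z-step: rather than hard-thresholding W + U to a preset sparsity level, each weight receives its own threshold lam * alpha_i / rho, so weights that have remained large are pruned gently while near-zero weights are driven out.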

Similar Articles

1. Reweighted Alternating Direction Method of Multipliers for DNN weight pruning.
Neural Netw. 2024 Nov;179:106534. doi: 10.1016/j.neunet.2024.106534. Epub 2024 Jul 14.

2. StructADMM: Achieving Ultrahigh Efficiency in Structured Pruning for DNNs.
IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):2259-2273. doi: 10.1109/TNNLS.2020.3045153. Epub 2022 May 2.

3. HRel: Filter pruning based on High Relevance between activation maps and class labels.
Neural Netw. 2022 Mar;147:186-197. doi: 10.1016/j.neunet.2021.12.017. Epub 2021 Dec 30.

4. Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning.
IEEE Trans Neural Netw Learn Syst. 2020 Feb;31(2):574-588. doi: 10.1109/TNNLS.2019.2906563. Epub 2019 Apr 12.

5. Feature flow regularization: Improving structured sparsity in deep neural networks.
Neural Netw. 2023 Apr;161:598-613. doi: 10.1016/j.neunet.2023.02.013. Epub 2023 Feb 13.

6. Non-Structured DNN Weight Pruning-Is It Beneficial in Any Platform?
IEEE Trans Neural Netw Learn Syst. 2022 Sep;33(9):4930-4944. doi: 10.1109/TNNLS.2021.3063265. Epub 2022 Aug 31.

7. Redundant feature pruning for accelerated inference in deep neural networks.
Neural Netw. 2019 Oct;118:148-158. doi: 10.1016/j.neunet.2019.04.021. Epub 2019 May 9.

8. Weak sub-network pruning for strong and efficient neural networks.
Neural Netw. 2021 Dec;144:614-626. doi: 10.1016/j.neunet.2021.09.015. Epub 2021 Sep 30.

9. Dynamical Channel Pruning by Conditional Accuracy Change for Deep Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2021 Feb;32(2):799-813. doi: 10.1109/TNNLS.2020.2979517. Epub 2021 Feb 4.

10. CRESPR: Modular sparsification of DNNs to improve pruning performance and model interpretability.
Neural Netw. 2024 Apr;172:106067. doi: 10.1016/j.neunet.2023.12.021. Epub 2023 Dec 17.