CRESPR: Modular sparsification of DNNs to improve pruning performance and model interpretability.

Affiliations

University of Massachusetts Boston, United States of America.

Publication Information

Neural Netw. 2024 Apr;172:106067. doi: 10.1016/j.neunet.2023.12.021. Epub 2023 Dec 17.

DOI: 10.1016/j.neunet.2023.12.021
PMID: 38199151
Abstract

Modern DNNs often include a huge number of parameters that are expensive in both computation and memory. Pruning can significantly reduce model complexity and lessen resource demands, and less complex models are also easier to explain and interpret. In this paper, we propose a novel pruning algorithm, Cluster-Restricted Extreme Sparsity Pruning of Redundancy (CRESPR), which prunes a neural network into modular units and achieves better pruning efficiency. Using the Hessian matrix, we provide an analytic explanation of why modular structures in a sparse DNN better maintain performance, especially at extremely high pruning ratios. In CRESPR, each modular unit contains mostly internal connections, which clearly shows how subgroups of input features are processed through the DNN and eventually contribute to classification decisions. Such process-level exposure of internal working mechanisms leads to better interpretability of an otherwise black-box DNN model. Extensive experiments with multiple DNN architectures and datasets show that CRESPR achieves higher pruning performance than current state-of-the-art methods at high and extremely high pruning ratios. Additionally, we show how CRESPR improves model interpretability through a concrete example.
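The abstract does not spell out how the cluster restriction interacts with pruning, so the following is a minimal sketch of one plausible reading, not the authors' implementation. It assumes neurons have already been assigned to modules and applies plain magnitude pruning with a much higher keep ratio for intra-module weights than for inter-module ones, so that the surviving connections concentrate inside modular units. All names and ratios (cluster_restricted_prune, intra_keep, inter_keep) are hypothetical, not values from the paper.

```python
import numpy as np

def cluster_restricted_prune(W, in_labels, out_labels,
                             intra_keep=0.30, inter_keep=0.02):
    """Magnitude-prune a weight matrix W (out_dim x in_dim), keeping a far
    larger fraction of intra-module weights than inter-module ones, so the
    surviving connections concentrate inside modular units.

    in_labels / out_labels assign each input / output neuron to a module;
    the keep ratios are illustrative, not taken from the paper."""
    # True where both endpoints of a connection belong to the same module.
    intra = out_labels[:, None] == in_labels[None, :]
    mask = np.zeros_like(W, dtype=bool)
    for region, keep in ((intra, intra_keep), (~intra, inter_keep)):
        mags = np.abs(W[region])
        if mags.size:
            k = max(1, int(keep * mags.size))    # number of weights to keep
            thresh = np.partition(mags, -k)[-k]  # k-th largest magnitude
            mask |= region & (np.abs(W) >= thresh)
    return W * mask

# Toy usage: 4 modules with random module assignments.
rng = np.random.default_rng(0)
W = rng.normal(size=(64, 128))
out_labels = rng.integers(0, 4, size=64)
in_labels = rng.integers(0, 4, size=128)
W_pruned = cluster_restricted_prune(W, in_labels, out_labels)
print(f"overall density: {np.count_nonzero(W_pruned) / W.size:.3f}")
```

Read this as a sketch of the restriction mechanism only: the actual CRESPR algorithm also covers how the modules are discovered in the first place and justifies the modular structure analytically via the Hessian of the loss, neither of which is reproduced here.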

Similar Articles

1. CRESPR: Modular sparsification of DNNs to improve pruning performance and model interpretability.
Neural Netw. 2024 Apr;172:106067. doi: 10.1016/j.neunet.2023.12.021. Epub 2023 Dec 17.

2. SSGD: Sparsity-promoting stochastic gradient descent algorithm for unbiased DNN pruning.
Proc IEEE Int Conf Acoust Speech Signal Process. 2020 May;2020:5410-5414. doi: 10.1109/icassp40776.2020.9054436. Epub 2020 May 14.

3. Reweighted Alternating Direction Method of Multipliers for DNN weight pruning.
Neural Netw. 2024 Nov;179:106534. doi: 10.1016/j.neunet.2024.106534. Epub 2024 Jul 14.

4. Jump-GRS: a multi-phase approach to structured pruning of neural networks for neural decoding.
J Neural Eng. 2023 Jul 31;20(4). doi: 10.1088/1741-2552/ace5dc.

5. StructADMM: Achieving Ultrahigh Efficiency in Structured Pruning for DNNs.
IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):2259-2273. doi: 10.1109/TNNLS.2020.3045153. Epub 2022 May 2.

6. Feature flow regularization: Improving structured sparsity in deep neural networks.
Neural Netw. 2023 Apr;161:598-613. doi: 10.1016/j.neunet.2023.02.013. Epub 2023 Feb 13.

7. GRIM: A General, Real-Time Deep Learning Inference Framework for Mobile Devices Based on Fine-Grained Structured Weight Sparsity.
IEEE Trans Pattern Anal Mach Intell. 2022 Oct;44(10):6224-6239. doi: 10.1109/TPAMI.2021.3089687. Epub 2022 Sep 14.

8. Pruning deep neural networks generates a sparse, bio-inspired nonlinear controller for insect flight.
PLoS Comput Biol. 2022 Sep 27;18(9):e1010512. doi: 10.1371/journal.pcbi.1010512. eCollection 2022 Sep.

9. Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks.
IEEE Trans Cybern. 2020 Aug;50(8):3594-3604. doi: 10.1109/TCYB.2019.2933477. Epub 2019 Aug 27.

10. Dynamic Image Difficulty-Aware DNN Pruning.
Micromachines (Basel). 2023 Apr 23;14(5):908. doi: 10.3390/mi14050908.