
Dynamic Probabilistic Pruning: A General Framework for Hardware-Constrained Pruning at Different Granularities.

Author Information

Gonzalez-Carabarin Lizeth, Huijben Iris A M, Veeling Bastian, Schmid Alexandre, van Sloun Ruud J G

Publication Information

IEEE Trans Neural Netw Learn Syst. 2022 Jun 8;PP. doi: 10.1109/TNNLS.2022.3176809.

DOI: 10.1109/TNNLS.2022.3176809
PMID: 35675247
Abstract

Unstructured neural network pruning algorithms have achieved impressive compression ratios. However, the resulting (typically irregular) sparse matrices hamper efficient hardware implementations, leading to additional memory usage and complex control logic that diminishes the benefits of unstructured pruning. This has spurred structured coarse-grained pruning solutions that prune entire feature maps or even layers, enabling efficient implementation at the expense of reduced flexibility. Here, we propose a flexible new pruning mechanism that facilitates pruning at different granularities (weights, kernels, and feature maps) while retaining efficient memory organization (e.g., pruning exactly k-out-of-n weights for every output neuron, or exactly k-out-of-n kernels for every feature map). We refer to this algorithm as dynamic probabilistic pruning (DPP). DPP leverages the Gumbel-softmax relaxation for differentiable k-out-of-n sampling, facilitating end-to-end optimization. We show that DPP achieves competitive compression ratios and classification accuracy when pruning common deep learning models trained on different benchmark datasets for image classification. Relevantly, the dynamic masking of DPP facilitates joint optimization of pruning and weight quantization to compress the network even further, which we also demonstrate. Finally, we propose novel information-theoretic metrics that capture the confidence and pruning diversity of the pruning masks within a layer.
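
The central mechanism, differentiable k-out-of-n sampling via the Gumbel-softmax relaxation, can be illustrated with a short PyTorch sketch. This is a minimal illustration of the general technique, not the authors' DPP implementation: the function name gumbel_topk_mask, the softmax-based soft relaxation, and the straight-through estimator are illustrative assumptions.

import torch

def gumbel_topk_mask(logits: torch.Tensor, k: int, tau: float = 1.0) -> torch.Tensor:
    """Sample a relaxed k-out-of-n binary mask over the last dimension.

    Hypothetical sketch: perturb learnable logits with Gumbel noise,
    relax the k-hot indicator with a temperature-scaled softmax, and
    apply a straight-through estimator so the forward pass is exactly
    k-sparse while gradients flow into the logits.
    """
    # Gumbel(0, 1) noise: -log(X) with X ~ Exponential(1)
    gumbels = -torch.empty_like(logits).exponential_().log()
    scores = (logits + gumbels) / tau

    # Soft relaxation of the k-hot mask; scaling the softmax by k makes
    # the expected number of kept entries equal to k (clamped to [0, 1]).
    soft = (torch.softmax(scores, dim=-1) * k).clamp(max=1.0)

    # Hard k-hot mask from the noisy scores.
    idx = scores.topk(k, dim=-1).indices
    hard = torch.zeros_like(logits).scatter_(-1, idx, 1.0)

    # Straight-through: forward pass uses the hard mask, the backward
    # pass differentiates through the soft relaxation.
    return hard + soft - soft.detach()

# Usage: prune exactly k of the n incoming weights of every output neuron.
n_out, n_in, k = 64, 256, 32
weight = torch.randn(n_out, n_in, requires_grad=True)
mask_logits = torch.zeros(n_out, n_in, requires_grad=True)  # learned jointly
mask = gumbel_topk_mask(mask_logits, k)
pruned_weight = weight * mask  # each row keeps exactly k nonzero weights

Because every row of the sampled mask keeps exactly k of its n entries, the pruned weight matrix retains the regular, hardware-friendly memory layout the abstract describes, while the mask logits remain trainable end to end.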


Similar Articles

1. Non-Structured DNN Weight Pruning-Is It Beneficial in Any Platform?
IEEE Trans Neural Netw Learn Syst. 2022 Sep;33(9):4930-4944. doi: 10.1109/TNNLS.2021.3063265. Epub 2022 Aug 31.
2. Coarse-Grained Pruning of Neural Network Models Based on Blocky Sparse Structure.
Entropy (Basel). 2021 Aug 13;23(8):1042. doi: 10.3390/e23081042.
3. Weak sub-network pruning for strong and efficient neural networks.
Neural Netw. 2021 Dec;144:614-626. doi: 10.1016/j.neunet.2021.09.015. Epub 2021 Sep 30.
4. Discrimination-Aware Network Pruning for Deep Model Compression.
IEEE Trans Pattern Anal Mach Intell. 2022 Aug;44(8):4035-4051. doi: 10.1109/TPAMI.2021.3066410. Epub 2022 Jul 1.
5. Ps and Qs: Quantization-Aware Pruning for Efficient Low Latency Neural Network Inference.
Front Artif Intell. 2021 Jul 9;4:676564. doi: 10.3389/frai.2021.676564. eCollection 2021.
6. Dynamical Conventional Neural Network Channel Pruning by Genetic Wavelet Channel Search for Image Classification.
Front Comput Neurosci. 2021 Oct 27;15:760554. doi: 10.3389/fncom.2021.760554. eCollection 2021.
7. Multi-objective evolutionary optimization for hardware-aware neural network pruning.
Fundam Res. 2022 Aug 9;4(4):941-950. doi: 10.1016/j.fmre.2022.07.013. eCollection 2024 Jul.
8. PSE-Net: Channel pruning for Convolutional Neural Networks with parallel-subnets estimator.
Neural Netw. 2024 Jun;174:106263. doi: 10.1016/j.neunet.2024.106263. Epub 2024 Mar 20.
9. EvoPruneDeepTL: An evolutionary pruning model for transfer learning based deep neural networks.
Neural Netw. 2023 Jan;158:59-82. doi: 10.1016/j.neunet.2022.10.011. Epub 2022 Nov 4.