
Filter Pruning Based on Information Capacity and Independence

Author Information

Tang Xiaolong, Ye Shuo, Shi Yufeng, Hu Tianheng, Peng Qinmu, You Xinge

Publication Information

IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):8401-8413. doi: 10.1109/TNNLS.2024.3415068. Epub 2025 May 2.

DOI: 10.1109/TNNLS.2024.3415068
PMID: 39231052
Abstract

Filter pruning has been widely adopted for compressing and accelerating convolutional neural networks (CNNs). However, existing approaches remain far from practical application due to biased filter selection and heavy computation costs. This article introduces a new filter pruning method that selects filters in an interpretable, multiperspective, and lightweight manner. Specifically, we evaluate the contributions of filters from both individual and overall perspectives. For the amount of information contained in each filter, we propose a new metric called information capacity. Inspired by information theory, we use the interpretable entropy to measure information capacity and develop a feature-guided approximation process. For correlations among filters, we design another metric called information independence. Because both metrics are evaluated in a simple but effective way, we can identify and prune the least important filters at low computation cost. We conduct comprehensive experiments on benchmark datasets employing various widely used CNN architectures to evaluate the performance of our method. For instance, on ILSVRC-2012, our method outperforms state-of-the-art methods by reducing floating-point operations (FLOPs) by 77.4% and parameters by 69.3% for ResNet-50, with only a minor accuracy decrease of 2.64%.

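The abstract describes two scoring criteria but not their exact formulas, so the following is only a minimal sketch of the general idea: score each filter by the entropy of its activation distribution (a stand-in for "information capacity") and by how decorrelated its batch-averaged response is from the other filters (a stand-in for "information independence"), then mark the lowest-scoring filters for pruning. The function names `filter_scores` and `filters_to_prune`, the histogram-entropy estimate, and the `alpha` blending weight are all illustrative assumptions, not the paper's method.

```python
import numpy as np

def filter_scores(feature_maps, bins=32, alpha=0.5):
    """Score filters by blending two criteria (both are illustrative proxies):
    - capacity: Shannon entropy of each filter's activation histogram
    - independence: 1 - mean absolute correlation with the other filters
    Higher score = more important. feature_maps has shape (N, C, H, W)."""
    n, c, h, w = feature_maps.shape
    flat = feature_maps.reshape(n, c, -1)

    # Information capacity proxy: entropy of the activation distribution.
    capacity = np.empty(c)
    for ch in range(c):
        hist, _ = np.histogram(flat[:, ch, :], bins=bins)
        p = hist / hist.sum()
        p = p[p > 0]  # drop empty bins so log is defined
        capacity[ch] = -np.sum(p * np.log(p))

    # Information independence proxy: filters whose batch-averaged responses
    # correlate strongly with others carry redundant information.
    resp = flat.mean(axis=0)               # (C, H*W) mean response per filter
    corr = np.abs(np.corrcoef(resp))
    np.fill_diagonal(corr, 0.0)            # ignore self-correlation
    independence = 1.0 - corr.sum(axis=1) / (c - 1)

    # Blend the two criteria after normalizing capacity to [0, 1].
    return alpha * capacity / capacity.max() + (1.0 - alpha) * independence

def filters_to_prune(feature_maps, prune_ratio=0.5):
    """Indices of the lowest-scoring filters, i.e. pruning candidates."""
    scores = filter_scores(feature_maps)
    k = int(round(len(scores) * prune_ratio))
    return np.argsort(scores)[:k]
```

In practice one would collect `feature_maps` from a single convolutional layer over a calibration batch, prune the returned channel indices (and the matching input channels of the next layer), then fine-tune to recover accuracy.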

Similar Articles

1. Filter Pruning Based on Information Capacity and Independence. IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):8401-8413. doi: 10.1109/TNNLS.2024.3415068. Epub 2025 May 2.
2. HRel: Filter pruning based on High Relevance between activation maps and class labels. Neural Netw. 2022 Mar;147:186-197. doi: 10.1016/j.neunet.2021.12.017. Epub 2021 Dec 30.
3. Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks. IEEE Trans Cybern. 2020 Aug;50(8):3594-3604. doi: 10.1109/TCYB.2019.2933477. Epub 2019 Aug 27.
4. Model pruning based on filter similarity for edge device deployment. Front Neurorobot. 2023 Mar 2;17:1132679. doi: 10.3389/fnbot.2023.1132679. eCollection 2023.
5. Filter Pruning by Switching to Neighboring CNNs With Good Attributes. IEEE Trans Neural Netw Learn Syst. 2023 Oct;34(10):8044-8056. doi: 10.1109/TNNLS.2022.3149332. Epub 2023 Oct 5.
6. SAAF: Self-Adaptive Attention Factor-Based Taylor-Pruning on Convolutional Neural Networks. IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):8540-8553. doi: 10.1109/TNNLS.2024.3439435. Epub 2025 May 2.
7. Where to Prune: Using LSTM to Guide Data-Dependent Soft Pruning. IEEE Trans Image Process. 2021;30:293-304. doi: 10.1109/TIP.2020.3035028. Epub 2020 Nov 24.
8. Carrying Out CNN Channel Pruning in a White Box. IEEE Trans Neural Netw Learn Syst. 2023 Oct;34(10):7946-7955. doi: 10.1109/TNNLS.2022.3147269. Epub 2023 Oct 5.
9. Filter Pruning via Measuring Feature Map Information. Sensors (Basel). 2021 Oct 2;21(19):6601. doi: 10.3390/s21196601.
10. Filter Pruning by High-Order Spectral Clustering. IEEE Trans Pattern Anal Mach Intell. 2025 Apr;47(4):2402-2415. doi: 10.1109/TPAMI.2024.3524381. Epub 2025 Mar 6.