

IESSP: Information Extraction-Based Sparse Stripe Pruning Method for Deep Neural Networks

Authors

Liu Jingjing, Huang Lingjin, Feng Manlong, Guo Aiying, Yin Luqiao, Zhang Jianhua

Affiliation

Shanghai Key Laboratory of Chips and Systems for Intelligent Connected Vehicle, School of Microelectronics, Shanghai University, Shanghai 200444, China.

Publication

Sensors (Basel). 2025 Apr 3;25(7):2261. doi: 10.3390/s25072261.

DOI: 10.3390/s25072261
PMID: 40218773
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11991360/
Abstract

Network pruning is a deep learning model compression technique aimed at reducing model storage requirements and decreasing computational resource consumption. However, mainstream pruning techniques often encounter challenges such as limited precision in feature selection and a diminished feature extraction capability. To address these issues, we propose an information extraction-based sparse stripe pruning (IESSP) method. This method introduces an information extraction module (IEM), which enhances stripe selection through a mask-based mechanism, promoting inter-layer interactions and directing the network's focus toward key features. In addition, we design a novel loss function that links output loss to stripe selection, enabling an effective balance between accuracy and efficiency. This loss function also supports the adaptive optimization of stripe sparsity during training. Experimental results on benchmark datasets demonstrate that the proposed method outperforms existing techniques. Specifically, when applied to prune the VGG-16 model on the CIFAR-10 dataset, the proposed method achieves a 0.29% improvement in accuracy while reducing FLOPs by 75.88% compared to the baseline.
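To make the core idea concrete: in stripe-wise pruning, the unit removed is not a whole filter but a single kernel position (a "stripe" spanning all input channels). The sketch below is not the authors' IEM or their mask/loss formulation; it is a minimal NumPy illustration, under the assumption that stripe importance is scored by an L1 norm, of how a binary mask can zero out the least important stripes of a convolutional weight tensor:

```python
import numpy as np

def stripe_prune(weight, sparsity):
    """Zero out the least-important stripes of a conv weight tensor.

    weight: array of shape (out_c, in_c, kh, kw). A "stripe" is one
    kernel position weight[o, :, i, j], spanning all input channels.
    sparsity: fraction of stripes to remove, in [0, 1).
    Returns (pruned_weight, mask) with mask of shape (out_c, kh, kw).
    """
    out_c, in_c, kh, kw = weight.shape
    # Score each stripe by its L1 norm over the input-channel axis.
    importance = np.abs(weight).sum(axis=1)            # (out_c, kh, kw)
    flat = importance.reshape(-1)
    n_prune = int(sparsity * flat.size)
    # Threshold at the n_prune-th smallest importance value.
    thresh = np.sort(flat)[n_prune] if n_prune > 0 else -np.inf
    mask = (importance >= thresh).astype(weight.dtype)
    # Broadcast the mask over the input-channel axis and apply it.
    pruned = weight * mask[:, None, :, :]
    return pruned, mask
```

In the method the abstract describes, such a mask would not be a fixed post-hoc threshold: the stripe selection is driven by the information extraction module and co-optimized with the loss term that ties output loss to stripe sparsity during training.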


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ee47/11991360/f24b3be7cd17/sensors-25-02261-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ee47/11991360/5c3eb77ecc75/sensors-25-02261-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ee47/11991360/3ed962e26ec4/sensors-25-02261-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ee47/11991360/9d79942b23fb/sensors-25-02261-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ee47/11991360/1fee62560991/sensors-25-02261-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ee47/11991360/ad399140b837/sensors-25-02261-g006.jpg

Similar Articles

1. IESSP: Information Extraction-Based Sparse Stripe Pruning Method for Deep Neural Networks. Sensors (Basel). 2025 Apr 3;25(7):2261. doi: 10.3390/s25072261.
2. Dynamical Conventional Neural Network Channel Pruning by Genetic Wavelet Channel Search for Image Classification. Front Comput Neurosci. 2021 Oct 27;15:760554. doi: 10.3389/fncom.2021.760554. eCollection 2021.
3. Weak sub-network pruning for strong and efficient neural networks. Neural Netw. 2021 Dec;144:614-626. doi: 10.1016/j.neunet.2021.09.015. Epub 2021 Sep 30.
4. Random pruning: channel sparsity by expectation scaling factor. PeerJ Comput Sci. 2023 Sep 5;9:e1564. doi: 10.7717/peerj-cs.1564. eCollection 2023.
5. SpQuant-SNN: ultra-low precision membrane potential with sparse activations unlock the potential of on-device spiking neural networks applications. Front Neurosci. 2024 Sep 4;18:1440000. doi: 10.3389/fnins.2024.1440000. eCollection 2024.
6. Adding Before Pruning: Sparse Filter Fusion for Deep Convolutional Neural Networks via Auxiliary Attention. IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):3930-3942. doi: 10.1109/TNNLS.2021.3106917. Epub 2025 Feb 28.
7. Filter Pruning via Measuring Feature Map Information. Sensors (Basel). 2021 Oct 2;21(19):6601. doi: 10.3390/s21196601.
8. A multi-agent reinforcement learning based approach for automatic filter pruning. Sci Rep. 2024 Dec 28;14(1):31193. doi: 10.1038/s41598-024-82562-w.
9. SAAF: Self-Adaptive Attention Factor-Based Taylor-Pruning on Convolutional Neural Networks. IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):8540-8553. doi: 10.1109/TNNLS.2024.3439435. Epub 2025 May 2.
10. Automatic Sparse Connectivity Learning for Neural Networks. IEEE Trans Neural Netw Learn Syst. 2023 Oct;34(10):7350-7364. doi: 10.1109/TNNLS.2022.3141665. Epub 2023 Oct 5.

References Cited in This Article

1. A Survey on Deep Neural Network Pruning: Taxonomy, Comparison, Analysis, and Recommendations. IEEE Trans Pattern Anal Mach Intell. 2024 Dec;46(12):10558-10578. doi: 10.1109/TPAMI.2024.3447085. Epub 2024 Nov 6.
2. A Comparative Study of Preprocessing and Model Compression Techniques in Deep Learning for Forest Sound Classification. Sensors (Basel). 2024 Feb 9;24(4):1149. doi: 10.3390/s24041149.
3. A Novel Deep-Learning Model Compression Based on Filter-Stripe Group Pruning and Its IoT Application. Sensors (Basel). 2022 Jul 27;22(15):5623. doi: 10.3390/s22155623.
4. SOKS: Automatic Searching of the Optimal Kernel Shapes for Stripe-Wise Network Pruning. IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):9912-9924. doi: 10.1109/TNNLS.2022.3162067. Epub 2023 Nov 30.
5. Automatic Sparse Connectivity Learning for Neural Networks. IEEE Trans Neural Netw Learn Syst. 2023 Oct;34(10):7350-7364. doi: 10.1109/TNNLS.2022.3141665. Epub 2023 Oct 5.
6. Filter Pruning via Measuring Feature Map Information. Sensors (Basel). 2021 Oct 2;21(19):6601. doi: 10.3390/s21196601.
7. Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks. IEEE Trans Cybern. 2020 Aug;50(8):3594-3604. doi: 10.1109/TCYB.2019.2933477. Epub 2019 Aug 27.
8. Tensor completion for estimating missing values in visual data. IEEE Trans Pattern Anal Mach Intell. 2013 Jan;35(1):208-20. doi: 10.1109/TPAMI.2012.39.