

A Lempel-Ziv complexity-based neural network pruning algorithm.

Affiliations

Department of Electronics and Communication Engineering, Khulna University of Engineering and Technology, Khulna-9203, Bangladesh.

Publication Information

Int J Neural Syst. 2011 Oct;21(5):427-41. doi: 10.1142/S0129065711002936.

DOI: 10.1142/S0129065711002936
PMID: 21956934
Abstract

This paper presents a pruning method for artificial neural networks (ANNs) based on the 'Lempel-Ziv complexity' (LZC) measure. We call this method the 'silent pruning algorithm' (SPA). The term 'silent' is used in the sense that SPA prunes ANNs without causing much disturbance during the network training. SPA prunes hidden units during the training process according to their ranks computed from LZC. LZC extracts the number of unique patterns in a time sequence obtained from the output of a hidden unit and a smaller value of LZC indicates higher redundancy of a hidden unit. SPA has a great resemblance to biological brains since it encourages higher complexity during the training process. SPA is similar to, yet different from, existing pruning algorithms. The algorithm has been tested on a number of challenging benchmark problems in machine learning, including cancer, diabetes, heart, card, iris, glass, thyroid, and hepatitis problems. We compared SPA with other pruning algorithms and we found that SPA is better than the 'random deletion algorithm' (RDA) which prunes hidden units randomly. Our experimental results show that SPA can simplify ANNs with good generalization ability.
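The scoring rule the abstract describes — turn each hidden unit's outputs over the training set into a binary sequence, measure that sequence's Lempel-Ziv complexity, and rank units so that low-complexity (highly redundant) units are pruned first — can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the `lz_complexity` function uses a simple LZ78-style phrase count (one common LZC variant), and the 0.5 binarization threshold and the `rank_hidden_units` helper are assumptions.

```python
import numpy as np

def lz_complexity(bits):
    """LZ78-style phrase count of a binary string: the number of
    distinct phrases produced by a greedy left-to-right parse."""
    phrases, phrase = set(), ""
    for b in bits:
        phrase += b
        if phrase not in phrases:
            phrases.add(phrase)   # new pattern found; start a fresh phrase
            phrase = ""
    # count a trailing, already-seen phrase as one more
    return len(phrases) + (1 if phrase else 0)

def rank_hidden_units(activations, threshold=0.5):
    """Rank hidden units by the LZ complexity of their binarized outputs.

    activations: (n_samples, n_hidden) array of hidden-unit outputs
    recorded over the training set.
    Returns (order, scores): unit indices sorted by ascending complexity
    (most redundant first, i.e. first candidates for pruning), and the
    per-unit complexity scores.
    """
    scores = []
    for j in range(activations.shape[1]):
        bits = "".join("1" if a > threshold else "0"
                       for a in activations[:, j])
        scores.append(lz_complexity(bits))
    return list(np.argsort(scores)), scores
```

For example, a unit whose output is nearly constant across training samples yields a short phrase dictionary (low LZC, high redundancy), while a unit whose output varies from sample to sample yields a richer one, matching the abstract's claim that smaller LZC indicates a more prunable unit.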


Similar Articles

1
A Lempel-Ziv complexity-based neural network pruning algorithm.
Int J Neural Syst. 2011 Oct;21(5):427-41. doi: 10.1142/S0129065711002936.
2
A new adaptive merging and growing algorithm for designing artificial neural networks.
IEEE Trans Syst Man Cybern B Cybern. 2009 Jun;39(3):705-22. doi: 10.1109/TSMCB.2008.2008724. Epub 2009 Feb 6.
3
A new algorithm to design compact two-hidden-layer artificial neural networks.
Neural Netw. 2001 Nov;14(9):1265-78. doi: 10.1016/s0893-6080(01)00075-2.
4
Pruning artificial neural networks using neural complexity measures.
Int J Neural Syst. 2008 Oct;18(5):389-403. doi: 10.1142/S012906570800166X.
5
Quantum-based algorithm for optimizing artificial neural networks.
IEEE Trans Neural Netw Learn Syst. 2013 Aug;24(8):1266-78. doi: 10.1109/TNNLS.2013.2249089.
6
A new constructive algorithm for architectural and functional adaptation of artificial neural networks.
IEEE Trans Syst Man Cybern B Cybern. 2009 Dec;39(6):1590-605. doi: 10.1109/TSMCB.2009.2021849. Epub 2009 Jun 5.
7
Bagging and boosting negatively correlated neural networks.
IEEE Trans Syst Man Cybern B Cybern. 2008 Jun;38(3):771-84. doi: 10.1109/TSMCB.2008.922055.
8
A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation.
Neural Netw. 2013 Oct;46:210-26. doi: 10.1016/j.neunet.2013.06.004. Epub 2013 Jun 14.
9
A node pruning algorithm based on a Fourier amplitude sensitivity test method.
IEEE Trans Neural Netw. 2006 Mar;17(2):273-93. doi: 10.1109/TNN.2006.871707.
10
An improvement of extreme learning machine for compact single-hidden-layer feedforward neural networks.
Int J Neural Syst. 2008 Oct;18(5):433-41. doi: 10.1142/S0129065708001695.

Cited By

1
Cognitive tasks and cerebral blood flow through anterior cerebral arteries: a study via functional transcranial Doppler ultrasound recordings.
BMC Med Imaging. 2016 Mar 12;16:22. doi: 10.1186/s12880-016-0125-0.
2
A cerebral blood flow evaluation during cognitive tasks following a cervical spinal cord injury: a case study using transcranial Doppler recordings.
Cogn Neurodyn. 2015 Dec;9(6):615-26. doi: 10.1007/s11571-015-9355-z. Epub 2015 Sep 3.
3
Cognitive tasks during walking affect cerebral blood flow signal features in middle cerebral arteries and their correlation to gait characteristics.
Behav Brain Funct. 2015 Sep 26;11(1):29. doi: 10.1186/s12993-015-0073-9.