

A penalty-function approach for pruning feedforward neural networks.

Author Information

Setiono R

Affiliation

Department of Information Systems and Computer Science, National University of Singapore, Kent Ridge, Republic of Singapore.

Publication Information

Neural Comput. 1997 Jan 1;9(1):185-204. doi: 10.1162/neco.1997.9.1.185.

DOI: 10.1162/neco.1997.9.1.185
PMID: 9117898
Abstract

This article proposes the use of a penalty function for pruning feedforward neural networks by weight elimination. The penalty function proposed consists of two terms. The first term discourages the use of unnecessary connections, and the second term prevents the weights of the connections from taking excessively large values. Simple criteria for eliminating weights from the network are also given. The effectiveness of this penalty function is tested on three well-known problems: the contiguity problem, the parity problems, and the MONK's problems. The pruned networks obtained for many of these problems have fewer connections than previously reported in the literature.
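The two-term structure described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the specific functional forms, coefficient values (`eps1`, `eps2`, `beta`), and the magnitude threshold in `prune` are illustrative assumptions, and the paper's actual elimination criteria may differ.

```python
def penalty(weights, eps1=0.1, eps2=1e-4, beta=10.0):
    """Two-term penalty on a list of connection weights.

    term1 discourages unnecessary connections: beta*w^2 / (1 + beta*w^2)
    saturates near 1 for large |w| but is near 0 for small |w|, so weights
    that contribute little are pushed toward zero at low cost.
    term2 is standard weight decay, preventing excessively large weights.
    All coefficient values here are illustrative, not from the paper.
    """
    term1 = sum(beta * w * w / (1.0 + beta * w * w) for w in weights)
    term2 = sum(w * w for w in weights)
    return eps1 * term1 + eps2 * term2


def prune(weights, threshold=0.1):
    """Illustrative magnitude-based elimination criterion:
    zero out weights whose magnitude falls below the threshold."""
    return [0.0 if abs(w) < threshold else w for w in weights]
```

After training with `penalty` added to the loss, unnecessary connections end up with small weights, so a simple magnitude test such as `prune` can remove them.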


Similar Articles

1. A penalty-function approach for pruning feedforward neural networks.
   Neural Comput. 1997 Jan 1;9(1):185-204. doi: 10.1162/neco.1997.9.1.185.
2. Pruning artificial neural networks using neural complexity measures.
   Int J Neural Syst. 2008 Oct;18(5):389-403. doi: 10.1142/S012906570800166X.
3. Extracting rules from neural networks by pruning and hidden-unit splitting.
   Neural Comput. 1997 Jan 1;9(1):205-25. doi: 10.1162/neco.1997.9.1.205.
4. Boundedness and convergence of online gradient method with penalty for feedforward neural networks.
   IEEE Trans Neural Netw. 2009 Jun;20(6):1050-4. doi: 10.1109/TNN.2009.2020848. Epub 2009 May 8.
5. EvoPruneDeepTL: An evolutionary pruning model for transfer learning based deep neural networks.
   Neural Netw. 2023 Jan;158:59-82. doi: 10.1016/j.neunet.2022.10.011. Epub 2022 Nov 4.
6. Radical pruning: a method to construct skeleton radial basis function networks.
   Int J Neural Syst. 2000 Apr;10(2):143-54. doi: 10.1142/S0129065700000120.
7. Constructive training methods for feedforward neural networks with binary weights.
   Int J Neural Syst. 1996 May;7(2):149-66. doi: 10.1142/s0129065796000129.
8. New training strategies for constructive neural networks with application to regression problems.
   Neural Netw. 2004 May;17(4):589-609. doi: 10.1016/j.neunet.2004.02.002.
9. An iterative pruning algorithm for feedforward neural networks.
   IEEE Trans Neural Netw. 1997;8(3):519-31. doi: 10.1109/72.572092.
10. A sequential learning scheme for function approximation using minimal radial basis function neural networks.
   Neural Comput. 1997 Feb 15;9(2):461-78. doi: 10.1162/neco.1997.9.2.461.

Cited By

1. Interplay of multiple synaptic plasticity features in filamentary memristive devices for neuromorphic computing.
   Sci Rep. 2016 Dec 16;6:39216. doi: 10.1038/srep39216.