

A simple method to derive bounds on the size and to train multilayer neural networks.

Author Information

Sartori M A, Antsaklis P J

Affiliation

Dept. of Electr. Eng., Notre Dame Univ., IN.

Publication Information

IEEE Trans Neural Netw. 1991;2(4):467-71. doi: 10.1109/72.88168.

DOI: 10.1109/72.88168
PMID: 18276399
Abstract

A new derivation is presented for the bounds on the size of a multilayer neural network required to exactly implement an arbitrary training set; namely, the training set can be implemented with zero error with two layers and with the number of hidden-layer neurons equal to #1 ≥ p − 1. The derivation does not require the separation of the input space by particular hyperplanes, as in previous derivations. The weights for the hidden layer can be chosen almost arbitrarily, and the weights for the output layer can be found by solving #1 + 1 linear equations. The method presented exactly solves (M), the multilayer neural network training problem, for any arbitrary training set.

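The construction described in the abstract can be sketched numerically: for p training patterns, fix a hidden layer of p − 1 neurons with (almost) arbitrary random weights, then determine the p output-layer parameters (p − 1 weights plus a bias) by solving p linear equations. The sketch below is illustrative, not the paper's exact procedure; the tanh activation, the XOR-style toy data, and all variable names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: p = 4 input/target pairs (XOR-like).
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([0.0, 1.0, 1.0, 0.0])
p = len(X)

# Hidden layer: p - 1 neurons with almost arbitrary random weights.
W = rng.normal(size=(X.shape[1], p - 1))
b = rng.normal(size=p - 1)
H = np.tanh(X @ W + b)                 # hidden activations, shape (p, p - 1)

# Output layer: append a bias column, giving p parameters, and solve
# the resulting p linear equations. For generic random hidden weights
# the p x p system has full rank, so the fit is exact.
A = np.hstack([H, np.ones((p, 1))])    # shape (p, p)
w_out = np.linalg.lstsq(A, y, rcond=None)[0]

pred = A @ w_out
print(np.max(np.abs(pred - y)))        # near machine precision: zero training error
```

The key point the example mirrors is that no iterative training is needed: once the hidden weights are fixed, fitting the training set exactly reduces to one linear solve.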

Similar Articles

1. A simple method to derive bounds on the size and to train multilayer neural networks. IEEE Trans Neural Netw. 1991;2(4):467-71. doi: 10.1109/72.88168.
2. Upper bounds on the number of hidden neurons in feedforward networks with arbitrary bounded nonlinear activation functions. IEEE Trans Neural Netw. 1998;9(1):224-9. doi: 10.1109/72.655045.
3. The No-Prop algorithm: a new learning algorithm for multilayer neural networks. Neural Netw. 2013 Jan;37:182-8. doi: 10.1016/j.neunet.2012.09.020. Epub 2012 Oct 15.
4. Bounds on the number of hidden neurons in three-layer binary neural networks. Neural Netw. 2003 Sep;16(7):995-1002. doi: 10.1016/S0893-6080(03)00006-6.
5. Novel maximum-margin training algorithms for supervised neural networks. IEEE Trans Neural Netw. 2010 Jun;21(6):972-84. doi: 10.1109/TNN.2010.2046423. Epub 2010 Apr 19.
6. A Sequential Learning Approach for Single Hidden Layer Neural Networks. Neural Netw. 1998 Jan;11(1):65-80. doi: 10.1016/s0893-6080(97)00111-1.
7. A fast multilayer neural-network training algorithm based on the layer-by-layer optimizing procedures. IEEE Trans Neural Netw. 1996;7(3):768-75. doi: 10.1109/72.501734.
8. A learning rule for very simple universal approximators consisting of a single layer of perceptrons. Neural Netw. 2008 Jun;21(5):786-95. doi: 10.1016/j.neunet.2007.12.036. Epub 2007 Dec 31.
9. On the initialization and optimization of multilayer perceptrons. IEEE Trans Neural Netw. 1994;5(5):738-51. doi: 10.1109/72.317726.
10. Derivation of the multilayer perceptron weight constraints for direct network interpretation and knowledge discovery. Neural Netw. 1999 Nov;12(9):1259-1271. doi: 10.1016/s0893-6080(99)00062-3.

Cited By

1. On predicting annual output energy of 4-terminal perovskite/silicon tandem PV cells for building integrated photovoltaic application using machine learning. Heliyon. 2023 Jul 13;9(7):e18097. doi: 10.1016/j.heliyon.2023.e18097. eCollection 2023 Jul.
2. Determining the number of hidden layer and hidden neuron of neural network for wind speed prediction. PeerJ Comput Sci. 2021 Sep 20;7:e724. doi: 10.7717/peerj-cs.724. eCollection 2021.