Probabilistic lower bounds for approximation by shallow perceptron networks

Authors

Kůrková Věra, Sanguineti Marcello

Affiliations

Institute of Computer Science, Czech Academy of Sciences, Pod Vodárenskou věží, 2 - 18207 Prague, Czech Republic.

DIBRIS, University of Genova, Via Opera Pia, 13 - 16145 Genova, Italy.

Publication

Neural Netw. 2017 Jul;91:34-41. doi: 10.1016/j.neunet.2017.04.003. Epub 2017 Apr 19.

DOI: 10.1016/j.neunet.2017.04.003
PMID: 28482227
Abstract

Limitations of approximation capabilities of shallow perceptron networks are investigated. Lower bounds on approximation errors are derived for binary-valued functions on finite domains. It is proven that unless the number of network units is sufficiently large (larger than any polynomial of the logarithm of the size of the domain) a good approximation cannot be achieved for almost any uniformly randomly chosen function on a given domain. The results are obtained by combining probabilistic Chernoff-Hoeffding bounds with estimates of the sizes of sets of functions exactly computable by shallow networks with increasing numbers of units.
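The probabilistic argument sketched in the abstract can be illustrated numerically. The following is a minimal sketch, not the paper's actual proof: the dictionary of random signum perceptrons and all sizes (domain dimension, unit count, trial count) are arbitrary choices for illustration. For a uniformly random ±1-valued function f on a finite domain of size m, the Chernoff-Hoeffding bound gives P(|⟨f, g⟩|/m ≥ ε) ≤ 2·exp(−ε²m/2) for each fixed dictionary unit g, so unless the dictionary is very large, no unit correlates well with f and a good approximation by few units is impossible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite domain: the 2^d points of {-1, 1}^d, so |domain| = m = 2^d.
d = 10
m = 2 ** d
X = np.array([[1 if (i >> j) & 1 else -1 for j in range(d)] for i in range(m)])

# A small dictionary of signum perceptrons x -> sign(w.x + b)
# (random weights; purely illustrative, not the paper's construction).
n_units = 50
W = rng.standard_normal((n_units, d))
b = rng.standard_normal(n_units)
G = np.sign(X @ W.T + b)        # shape (m, n_units), entries in {-1, +1}
G[G == 0] = 1                   # guard against exact zeros

# Uniformly random binary-valued target functions on the domain.
n_trials = 200
F = rng.choice([-1.0, 1.0], size=(n_trials, m))

# Normalized correlation of each random f with each dictionary unit.
# Hoeffding: for fixed g, P(|<f, g>|/m >= eps) <= 2 exp(-eps^2 m / 2),
# so with high probability every unit correlates only weakly with f.
corr = np.abs(F @ G) / m        # shape (n_trials, n_units)
best = corr.max(axis=1)         # best single-unit correlation per trial
print(best.mean())              # typically well below 0.2 for m = 1024
```

With m = 1024 and ε = 0.2 the bound is 2·exp(−20.48) per unit, so even after a union bound over all 50 units the chance of any strong correlation is negligible, which the printed mean reflects.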


Similar Articles

1. Probabilistic lower bounds for approximation by shallow perceptron networks. Neural Netw. 2017 Jul;91:34-41. doi: 10.1016/j.neunet.2017.04.003. Epub 2017 Apr 19.
2. Classification by Sparse Neural Networks. IEEE Trans Neural Netw Learn Syst. 2019 Sep;30(9):2746-2754. doi: 10.1109/TNNLS.2018.2888517. Epub 2019 Jan 10.
3. Minimization of error functionals over perceptron networks. Neural Comput. 2008 Jan;20(1):252-70. doi: 10.1162/neco.2008.20.1.252.
4. Approximation of classifiers by deep perceptron networks. Neural Netw. 2023 Aug;165:654-661. doi: 10.1016/j.neunet.2023.06.004. Epub 2023 Jun 7.
5. An integral upper bound for neural network approximation. Neural Comput. 2009 Oct;21(10):2970-89. doi: 10.1162/neco.2009.04-08-745.
6. Dimension independent bounds for general shallow networks. Neural Netw. 2020 Mar;123:142-152. doi: 10.1016/j.neunet.2019.11.006. Epub 2019 Nov 22.
7. Error bounds for approximations with deep ReLU networks. Neural Netw. 2017 Oct;94:103-114. doi: 10.1016/j.neunet.2017.07.002. Epub 2017 Jul 13.
8. Can dictionary-based computational models outperform the best linear ones? Neural Netw. 2011 Oct;24(8):881-7. doi: 10.1016/j.neunet.2011.05.014. Epub 2011 Jun 12.
9. A learning rule for very simple universal approximators consisting of a single layer of perceptrons. Neural Netw. 2008 Jun;21(5):786-95. doi: 10.1016/j.neunet.2007.12.036. Epub 2007 Dec 31.
10. The capacity of feedforward neural networks. Neural Netw. 2019 Aug;116:288-311. doi: 10.1016/j.neunet.2019.04.009. Epub 2019 Apr 22.

Cited By

1. Universal approximation with quadratic deep networks. Neural Netw. 2020 Apr;124:383-392. doi: 10.1016/j.neunet.2020.01.007. Epub 2020 Jan 18.
2. The use of back propagation neural networks and 18F-Florbetapir PET for early detection of Alzheimer's disease using Alzheimer's Disease Neuroimaging Initiative database. PLoS One. 2019 Dec 26;14(12):e0226577. doi: 10.1371/journal.pone.0226577. eCollection 2019.
3. Blessing of dimensionality: mathematical foundations of the statistical physics of data. Philos Trans A Math Phys Eng Sci. 2018 Apr 28;376(2118). doi: 10.1098/rsta.2017.0237.