
Consistency of posterior distributions for neural networks.

Authors

Lee H K

Affiliations

Institute of Statistics and Decision Sciences, Duke University, Durham, NC 27708, USA.

Publication Information

Neural Netw. 2000 Jul;13(6):629-42. doi: 10.1016/s0893-6080(00)00045-9.

DOI: 10.1016/s0893-6080(00)00045-9
PMID: 10987516
Abstract

In this paper we show that the posterior distribution for feedforward neural networks is asymptotically consistent. This paper extends earlier results on universal approximation properties of neural networks to the Bayesian setting. The proof of consistency embeds the problem in a density estimation problem, then uses bounds on the bracketing entropy to show that the posterior is consistent over Hellinger neighborhoods. It then relates this result back to the regression setting. We show consistency in both the setting of the number of hidden nodes growing with the sample size, and in the case where the number of hidden nodes is treated as a parameter. Thus we provide a theoretical justification for using neural networks for nonparametric regression in a Bayesian framework.
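The abstract's key notion can be sketched in standard notation. The following is a generic formulation of posterior consistency over Hellinger neighborhoods (textbook definitions, not reproduced from the paper itself; the symbols $f_0$, $\Pi$, and $\varepsilon$ are the usual conventions, not the paper's):

```latex
% Hellinger distance between a candidate density f and the true density f_0
d_H(f, f_0) \;=\; \left( \int \bigl( \sqrt{f(x)} - \sqrt{f_0(x)} \bigr)^2 \, dx \right)^{1/2}

% Posterior consistency: for every \varepsilon > 0, the posterior mass
% outside the Hellinger \varepsilon-neighborhood of f_0 vanishes
% almost surely as the sample size n grows
\Pi\bigl( \{ f : d_H(f, f_0) > \varepsilon \} \;\big|\; X_1, \dots, X_n \bigr)
\;\xrightarrow[n \to \infty]{}\; 0 \quad \text{a.s.}
```

The bracketing-entropy bounds mentioned in the abstract are the standard device for verifying such statements: they control how many brackets are needed to cover the sieve of neural-network densities, which in turn bounds the posterior mass outside the neighborhood.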


Similar Articles

1. Consistency of posterior distributions for neural networks.
   Neural Netw. 2000 Jul;13(6):629-42. doi: 10.1016/s0893-6080(00)00045-9.
2. Statistical foundation of Variational Bayes neural networks.
   Neural Netw. 2021 May;137:151-173. doi: 10.1016/j.neunet.2021.01.027. Epub 2021 Feb 5.
3. Bayesian ARTMAP for regression.
   Neural Netw. 2013 Oct;46:23-31. doi: 10.1016/j.neunet.2013.04.006. Epub 2013 Apr 22.
4. Divergence measures and a general framework for local variational approximation.
   Neural Netw. 2011 Dec;24(10):1102-9. doi: 10.1016/j.neunet.2011.06.004. Epub 2011 Jun 15.
5. Stochastic complexities of reduced rank regression in Bayesian estimation.
   Neural Netw. 2005 Sep;18(7):924-33. doi: 10.1016/j.neunet.2005.03.014.
6. Sequential Bayesian kernel modelling with non-Gaussian noise.
   Neural Netw. 2008 Jan;21(1):36-47. doi: 10.1016/j.neunet.2007.08.001. Epub 2007 Oct 9.
7. Density-driven generalized regression neural networks (DD-GRNN) for function approximation.
   IEEE Trans Neural Netw. 2007 Nov;18(6):1683-96. doi: 10.1109/TNN.2007.902730.
8. On the relationship between deterministic and probabilistic directed Graphical models: from Bayesian networks to recursive neural networks.
   Neural Netw. 2005 Oct;18(8):1080-6. doi: 10.1016/j.neunet.2005.07.007. Epub 2005 Sep 12.
9. Local coupled feedforward neural network.
   Neural Netw. 2010 Jan;23(1):108-13. doi: 10.1016/j.neunet.2009.06.016. Epub 2009 Jun 30.
10. Stochastic complexities of general mixture models in variational Bayesian learning.
    Neural Netw. 2007 Mar;20(2):210-9. doi: 10.1016/j.neunet.2006.05.030. Epub 2006 Aug 10.

Cited By

1. Gradient-flow adaptive importance sampling for Bayesian leave one out cross-validation with application to sigmoidal classification models.
   ArXiv. 2024 Oct 20:arXiv:2402.08151v2.
2. Transfer Learning in Multiple Hypothesis Testing.
   Entropy (Basel). 2024 Jan 4;26(1):0. doi: 10.3390/e26010049.