Stochastic complexities of general mixture models in variational Bayesian learning.

Author information

Watanabe Kazuho, Watanabe Sumio

Affiliations

Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Japan.

Publication information

Neural Netw. 2007 Mar;20(2):210-9. doi: 10.1016/j.neunet.2006.05.030. Epub 2006 Aug 10.

DOI: 10.1016/j.neunet.2006.05.030
PMID: 16904288
Abstract

In this paper, we focus on variational Bayesian learning of general mixture models. Variational Bayesian learning was proposed as an approximation of Bayesian learning. While it has provided computational tractability and good generalization performance in many applications, little has been done to investigate its theoretical properties. The main contribution of this paper is the asymptotic form of the stochastic complexity, or free energy, in variational Bayesian learning of a mixture of exponential-family distributions. We show that these stochastic complexities become smaller than those of regular statistical models, which implies that the advantages of Bayesian learning are still retained in variational Bayesian learning. Moreover, the derived bounds indicate how the hyperparameters influence the learning process, and how accurate the variational Bayesian approach is as an approximation of true Bayesian learning.
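
As background for the terminology, here is a minimal sketch of the two quantities involved, in standard notation that is assumed here rather than taken from the paper. For data X^n = (x_1, ..., x_n), a model p(x | w) and a prior \varphi(w), the stochastic complexity, or Bayesian free energy, is

    F(X^n) = -\log \int \prod_{i=1}^{n} p(x_i \mid w)\, \varphi(w)\, dw .

Variational Bayesian learning minimizes, over a restricted family of trial posteriors q (for mixture models, factorized across parameters and hidden labels), the upper bound given by Jensen's inequality; written for the parameter posterior alone,

    \bar{F}(X^n) = \min_{q} \; \mathbb{E}_{q(w)}\!\left[ \log \frac{q(w)}{p(X^n \mid w)\, \varphi(w)} \right] \;\ge\; F(X^n),

and it is the asymptotic form of this variational free energy, and its gap to F(X^n), that the paper analyzes.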

Similar articles

1. Stochastic complexities of general mixture models in variational Bayesian learning.
   Neural Netw. 2007 Mar;20(2):210-9. doi: 10.1016/j.neunet.2006.05.030. Epub 2006 Aug 10.
2. Stochastic complexities of reduced rank regression in Bayesian estimation.
   Neural Netw. 2005 Sep;18(7):924-33. doi: 10.1016/j.neunet.2005.03.014.
3. Singularities in mixture models and upper bounds of stochastic complexity.
   Neural Netw. 2003 Sep;16(7):1029-38. doi: 10.1016/S0893-6080(03)00005-4.
4. Globally multimodal problem optimization via an estimation of distribution algorithm based on unsupervised learning of Bayesian networks.
   Evol Comput. 2005 Spring;13(1):43-66. doi: 10.1162/1063656053583432.
5. Asymptotic analysis of Bayesian generalization error with Newton diagram.
   Neural Netw. 2010 Jan;23(1):35-43. doi: 10.1016/j.neunet.2009.07.029. Epub 2009 Aug 7.
6. Divergence measures and a general framework for local variational approximation.
   Neural Netw. 2011 Dec;24(10):1102-9. doi: 10.1016/j.neunet.2011.06.004. Epub 2011 Jun 15.
7. Variational free energy and the Laplace approximation.
   Neuroimage. 2007 Jan 1;34(1):220-34. doi: 10.1016/j.neuroimage.2006.08.035. Epub 2006 Oct 20.
8. Latent-space variational Bayes.
   IEEE Trans Pattern Anal Mach Intell. 2008 Dec;30(12):2236-42. doi: 10.1109/TPAMI.2008.157.
9. Propagation and control of stochastic signals through universal learning networks.
   Neural Netw. 2006 May;19(4):487-99. doi: 10.1016/j.neunet.2005.10.005. Epub 2006 Jan 18.
10. Variational Bayesian blind deconvolution using a total variation prior.
    IEEE Trans Image Process. 2009 Jan;18(1):12-26. doi: 10.1109/TIP.2008.2007354.