Watanabe Kazuho, Watanabe Sumio
Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, Japan.
Neural Netw. 2007 Mar;20(2):210-9. doi: 10.1016/j.neunet.2006.05.030. Epub 2006 Aug 10.
In this paper, we focus on variational Bayesian learning of general mixture models. Variational Bayesian learning was proposed as an approximation of Bayesian learning and, while it has provided computational tractability and good generalization in many applications, its theoretical properties have been little investigated. The main contribution of this paper is the asymptotic form of the stochastic complexity, or the free energy, in variational Bayesian learning of mixtures of exponential-family distributions. We show that the stochastic complexity is smaller than that of regular statistical models, which implies that the advantages of Bayesian learning are retained in variational Bayesian learning. Moreover, the derived bounds indicate how the hyperparameters influence the learning process and how accurate the variational Bayesian approach is as an approximation of true Bayesian learning.
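To make the comparison with regular models concrete, the following is a minimal sketch of the asymptotic forms involved; the symbols n (sample size), d (number of parameters), and \lambda (a coefficient depending on the model and the hyperparameters) are illustrative notation, and the exact value of \lambda is what the paper derives. For a regular statistical model with d parameters, the Bayesian stochastic complexity satisfies

F_{\mathrm{Bayes}}(n) = \frac{d}{2}\log n + O(1),

whereas the result summarized above states that, for variational Bayesian learning of a mixture of exponential-family distributions,

F_{\mathrm{VB}}(n) = \lambda \log n + o(\log n), \qquad \lambda < \frac{d}{2}.

Because the coefficient of \log n is smaller than the regular-model value d/2, the stochastic complexity grows more slowly, which is the sense in which the advantages of Bayesian learning are retained; the dependence of \lambda on the prior hyperparameters is what the derived bounds quantify.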