The Bayesian evidence scheme for regularizing probability-density estimating neural networks.

Authors

Husmeier D

Affiliation

Biomathematics and Statistics Scotland, Scottish Crop Research Institute, Dundee, UK.

Publication

Neural Comput. 2000 Nov;12(11):2685-717. doi: 10.1162/089976600300014890.

Abstract

Training probability-density estimating neural networks with the expectation-maximization (EM) algorithm aims to maximize the likelihood of the training set and therefore leads to overfitting for sparse data. In this article, a regularization method for mixture models with generalized linear kernel centers is proposed, which adopts the Bayesian evidence approach and optimizes the hyperparameters of the prior by type II maximum likelihood. This includes a marginalization over the parameters, which is done by Laplace approximation and requires the derivation of the Hessian of the log-likelihood function. The incorporation of this approach into the standard training scheme leads to a modified form of the EM algorithm, which includes a regularization term and adapts the hyperparameters on-line after each EM cycle. The article presents applications of this scheme to classification problems, the prediction of stochastic time series, and latent space models.
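The regularized EM scheme the abstract describes can be illustrated with a toy example. The sketch below is NOT Husmeier's algorithm (which uses generalized linear kernel centers and a Laplace-approximated marginalization): it is a minimal 1-D Gaussian mixture trained by MAP-style EM, with a Gaussian prior N(0, 1/alpha) on the component means acting as the regularization term, and the hyperparameter alpha re-estimated after each EM cycle in the spirit of type II maximum likelihood. All names, the choice of prior, and the simplified alpha update `alpha = K / ||mu||^2` are assumptions made for illustration.

```python
import numpy as np

def regularized_em(x, K=3, alpha=1.0, n_iter=50, seed=0):
    """MAP-style EM for a 1-D Gaussian mixture with a Gaussian prior
    N(0, 1/alpha) on the component means. alpha is re-estimated after
    each EM cycle (a crude evidence-style update). Illustrative only."""
    rng = np.random.default_rng(seed)
    n = len(x)
    mu = rng.choice(x, K)            # component means, initialized from data
    var = np.full(K, np.var(x))      # component variances
    pi = np.full(K, 1.0 / K)         # mixing weights
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] = p(component k | x_i)
        dens = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        Nk = r.sum(axis=0)
        # M-step with prior: alpha * var in the denominator shrinks each
        # mean toward the prior mean 0 -- this is the regularization term
        mu = (r * x[:, None]).sum(axis=0) / (Nk + alpha * var)
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / (Nk + 1e-9) + 1e-6
        pi = Nk / n
        # hyperparameter adaptation after each EM cycle: a simplified
        # type II ML update with the effective parameter count set to K
        alpha = K / (mu @ mu + 1e-6)
    return mu, var, pi, alpha
```

Without the prior (alpha = 0) this reduces to ordinary maximum-likelihood EM, which is exactly the setting the abstract identifies as prone to overfitting on sparse data; the online alpha update mirrors the paper's idea of adapting hyperparameters after each EM cycle rather than fixing them in advance.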

Similar articles

1. The Bayesian evidence scheme for regularizing probability-density estimating neural networks.
   Neural Comput. 2000 Nov;12(11):2685-717. doi: 10.1162/089976600300014890.
2. Recursive Bayesian recurrent neural networks for time-series modeling.
   IEEE Trans Neural Netw. 2010 Feb;21(2):262-74. doi: 10.1109/TNN.2009.2036174. Epub 2009 Dec 28.
4. Density-driven generalized regression neural networks (DD-GRNN) for function approximation.
   IEEE Trans Neural Netw. 2007 Nov;18(6):1683-96. doi: 10.1109/TNN.2007.902730.
5. Regularized variational Bayesian learning of echo state networks with delay&sum readout.
   Neural Comput. 2012 Apr;24(4):967-95. doi: 10.1162/NECO_a_00253. Epub 2011 Dec 14.
6. Bayesian Gaussian process classification with the EM-EP algorithm.
   IEEE Trans Pattern Anal Mach Intell. 2006 Dec;28(12):1948-59. doi: 10.1109/TPAMI.2006.238.
7. Stochastic complexities of general mixture models in variational Bayesian learning.
   Neural Netw. 2007 Mar;20(2):210-9. doi: 10.1016/j.neunet.2006.05.030. Epub 2006 Aug 10.
8. Invariance priors for Bayesian feed-forward neural networks.
   Neural Netw. 2006 Dec;19(10):1550-7. doi: 10.1016/j.neunet.2006.01.017. Epub 2006 Mar 31.
9. Sparse Bayesian Classification of EEG for Brain-Computer Interface.
   IEEE Trans Neural Netw Learn Syst. 2016 Nov;27(11):2256-2267. doi: 10.1109/TNNLS.2015.2476656. Epub 2015 Sep 23.
10. A Bayesian approach to joint feature selection and classifier design.
   IEEE Trans Pattern Anal Mach Intell. 2004 Sep;26(9):1105-11. doi: 10.1109/TPAMI.2004.55.

Cited by
