IEEE Trans Neural Netw Learn Syst. 2012 Dec;23(12):1862-71. doi: 10.1109/TNNLS.2012.2217986.
Gaussian processes (GPs) constitute one of the most important Bayesian machine learning approaches, based on a particularly effective method for placing a prior distribution over the space of regression functions. Several researchers have considered postulating mixtures of GPs as a means of dealing with nonstationary covariance functions, discontinuities, multimodality, and overlapping output signals. In existing works, mixtures of GPs are based on the introduction of a gating function defined over the space of model input variables. This way, each postulated mixture component GP is effectively restricted to a limited subset of the input space. In this paper, we follow a different approach. We consider a fully generative nonparametric Bayesian model with power-law behavior, generating GPs over the whole input space of the learned task. We provide an efficient algorithm for model inference, based on the variational Bayesian framework, and demonstrate its efficacy using benchmark and real-world datasets.
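To make the abstract's generative view concrete, the following is a minimal sketch (not the authors' code) of sampling from a power-law nonparametric mixture of GPs: mixture weights come from a truncated Pitman-Yor stick-breaking construction, whose discount parameter yields the power-law behavior the abstract refers to, and each component is a GP defined over the whole input space rather than a gated region. All function names, kernel choices, and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pitman_yor_weights(alpha, d, n_comp, rng):
    """Truncated stick-breaking for a Pitman-Yor process.

    v_k ~ Beta(1 - d, alpha + k*d);  pi_k = v_k * prod_{j<k} (1 - v_j).
    Setting d = 0 recovers the Dirichlet process; 0 < d < 1 gives
    power-law behavior in the number of occupied components.
    """
    v = rng.beta(1.0 - d, alpha + d * np.arange(1, n_comp + 1))
    pi = v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))
    return pi / pi.sum()  # renormalize to absorb truncation error

def rbf_kernel(x, lengthscale=0.3, var=1.0):
    sq = (x[:, None] - x[None, :]) ** 2
    return var * np.exp(-0.5 * sq / lengthscale**2)

# Shared input grid; every component GP covers the WHOLE input space.
x = np.linspace(0.0, 1.0, 200)
K = rbf_kernel(x) + 1e-8 * np.eye(x.size)  # jitter for stability

n_comp = 10  # truncation level of the nonparametric mixture
pi = pitman_yor_weights(alpha=1.0, d=0.5, n_comp=n_comp, rng=rng)
gps = rng.multivariate_normal(np.zeros(x.size), K, size=n_comp)

# Generative step: each observation picks a component via the
# power-law weights, then reads off that component's GP plus noise.
z = rng.choice(n_comp, size=x.size, p=pi)        # latent assignments
y = gps[z, np.arange(x.size)] + 0.05 * rng.normal(size=x.size)
```

Because the assignments z depend only on the global weights pi and not on x, every component GP remains active everywhere, which is the key contrast with gating-function mixtures; the paper's actual contribution, variational Bayesian inference for this model, is not sketched here.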