A stochastic variational framework for Recurrent Gaussian Processes models.

Affiliations

Computer Science Department (DC), Federal University of Ceará (UFC), Center of Sciences, Campus of Pici, Fortaleza, Ceará, Brazil; Department of Teleinformatics Engineering (DETI), Federal University of Ceará (UFC), Center of Technology, Campus of Pici, Fortaleza, Ceará, Brazil.

Publication

Neural Netw. 2019 Apr;112:54-72. doi: 10.1016/j.neunet.2019.01.005. Epub 2019 Feb 1.

DOI: 10.1016/j.neunet.2019.01.005
PMID: 30753963
Abstract

Gaussian Process (GP) models have been successfully applied to the problem of learning from sequential observations. In this context, the family of Recurrent Gaussian Processes (RGPs) has recently been introduced, with a structure specifically designed to handle dynamical data. However, RGPs share a limitation common to most GP approaches: they become computationally infeasible on very large datasets. In the present work, with the aim of improving scalability, we modify the original variational approach used with RGPs to enable inference via stochastic mini-batch optimization, giving rise to the Stochastic Recurrent Variational Bayes (S-REVARB) framework. We review recent related literature and comprehensively contextualize it with respect to our approach. Moreover, we propose two learning procedures, the Local and Global S-REVARB algorithms, which prevent computational costs from scaling with the number of training samples. The Global variant permits even greater scalability by also preventing the number of variational parameters from growing with the training set, through the use of neural networks as sequential recognition models. The proposed framework is evaluated on the task of dynamical system identification for large-scale datasets, a scenario not readily supported by standard batch inference for RGPs. The promising results indicate that the S-REVARB framework opens up the possibility of applying powerful hierarchical recurrent GP-based models to massive sequential data.
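The central trick the abstract describes — replacing a full-batch variational bound with an unbiased mini-batch estimate so that the per-step cost depends on the batch size rather than on the number of training samples N — can be illustrated with a minimal, generic stochastic variational inference sketch. This is not the authors' S-REVARB (no recurrence, no GP); it fits a mean-field Gaussian posterior q(w) for a toy Bayesian linear regression using reparameterized mini-batch gradients, and all names and constants are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = w*x + noise with true weight 2.0 and noise std 0.1.
N = 2000
x = rng.normal(size=N)
y = 2.0 * x + 0.1 * rng.normal(size=N)

noise_var = 0.1 ** 2        # assumed known
mu, log_s = 0.0, -2.0       # variational parameters of q(w) = N(mu, exp(log_s)^2)
lr, batch = 1e-3, 64

for step in range(5000):
    idx = rng.integers(0, N, size=batch)   # random mini-batch: cost is O(batch), not O(N)
    eps = rng.normal()
    s = np.exp(log_s)
    w = mu + s * eps                       # reparameterized sample from q(w)
    resid = y[idx] - w * x[idx]
    # Unbiased per-point gradient of the ELBO w.r.t. w:
    # mini-batch likelihood term + (1/N)-scaled prior N(0, 1).
    g = np.mean(resid * x[idx]) / noise_var - w / N
    mu += lr * g                           # chain rule: dw/dmu = 1
    log_s += lr * (g * s * eps + 1.0 / N)  # dw/dlog_s = s*eps; entropy grad = 1/N

print(mu)  # close to the true weight 2.0
```

The same principle underlies the scalability claim in the abstract: because each gradient step touches only a mini-batch, the number of training samples never enters the per-iteration cost, and (in the Global variant) an amortized recognition network would additionally keep the number of variational parameters fixed.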

Similar articles

1. A stochastic variational framework for Recurrent Gaussian Processes models.
Neural Netw. 2019 Apr;112:54-72. doi: 10.1016/j.neunet.2019.01.005. Epub 2019 Feb 1.
2. Stochastic complexities of general mixture models in variational Bayesian learning.
Neural Netw. 2007 Mar;20(2):210-9. doi: 10.1016/j.neunet.2006.05.030. Epub 2006 Aug 10.
3. Unsupervised learning of Gaussian mixtures based on variational component splitting.
IEEE Trans Neural Netw. 2007 May;18(3):745-55. doi: 10.1109/TNN.2006.891114.
4. Variational learning in nonlinear Gaussian belief networks.
Neural Comput. 1999 Jan 1;11(1):193-213. doi: 10.1162/089976699300016872.
5. Variational mean-field algorithm for efficient inference in large systems of stochastic differential equations.
Phys Rev E Stat Nonlin Soft Matter Phys. 2015 Jan;91(1):012148. doi: 10.1103/PhysRevE.91.012148. Epub 2015 Jan 30.
6. Divergence measures and a general framework for local variational approximation.
Neural Netw. 2011 Dec;24(10):1102-9. doi: 10.1016/j.neunet.2011.06.004. Epub 2011 Jun 15.
7. Stochastic learning via optimizing the variational inequalities.
IEEE Trans Neural Netw Learn Syst. 2014 Oct;25(10):1769-78. doi: 10.1109/TNNLS.2013.2294741.
8. Algebraic geometrical methods for hierarchical learning machines.
Neural Netw. 2001 Oct;14(8):1049-60. doi: 10.1016/s0893-6080(01)00069-7.
9. Regularized variational Bayesian learning of echo state networks with delay&sum readout.
Neural Comput. 2012 Apr;24(4):967-95. doi: 10.1162/NECO_a_00253. Epub 2011 Dec 14.
10. Bayesian multitask classification with Gaussian process priors.
IEEE Trans Neural Netw. 2011 Dec;22(12):2011-21. doi: 10.1109/TNN.2011.2168568. Epub 2011 Oct 10.