Probabilistic Approximation of Stochastic Time Series Using Bayesian Recurrent Neural Network.

Author information

Wang Yuhang, Cai Kaiquan, Meng Deyuan

Publication information

IEEE Trans Neural Netw Learn Syst. 2025 Jun;36(6):11664-11671. doi: 10.1109/TNNLS.2025.3529995.

Abstract

In this brief, we investigate the approximation theory (AT) of the Bayesian recurrent neural network (BRNN) for stochastic time series forecasting (TSF) from a probabilistic standpoint. Stochastic time series contain cumulative dependencies that are incompatible with the recurrent structure of the BRNN and complicate the analysis of AT, so we first perform marginalization and transform the time series into a probabilistically equivalent latent variable model (LVM). Subsequently, we analyze the AT by evaluating the approximation error between the output mean of the BRNN and that of the LVM, derived through Taylor expansion-based uncertainty propagation and distribution parameterization, respectively. Finally, leveraging Khinchin's law of large numbers, we study the convergence in probability of the sampling-based training algorithm, i.e., Bayes by Backprop (BBB), and prove that increasing the number of Monte Carlo samples in BBB drives the convergence probability toward one. Numerical simulations demonstrate the validity of our results.
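The convergence result above concerns the Monte Carlo average inside the BBB objective. As a minimal sketch of that estimator, assuming a single Bayesian linear layer in PyTorch rather than the paper's full BRNN, the snippet below draws `num_samples` weight samples from a factorized Gaussian variational posterior and averages the resulting negative-ELBO terms; the layer sizes, prior scale, softplus parameterization, and sample count are illustrative assumptions, not details from the paper.

```python
# A minimal sketch of the Bayes by Backprop (BBB) Monte Carlo objective,
# shown for a single Bayesian linear layer; all sizes and hyperparameters
# here are illustrative assumptions, not values from the paper.
import torch
import torch.nn.functional as F
from torch.distributions import Normal

class BayesianLinear(torch.nn.Module):
    def __init__(self, in_features, out_features, prior_scale=1.0):
        super().__init__()
        # Variational posterior q(w | mu, rho): a factorized Gaussian whose
        # standard deviation sigma = softplus(rho) stays positive.
        self.mu = torch.nn.Parameter(0.1 * torch.randn(out_features, in_features))
        self.rho = torch.nn.Parameter(-3.0 * torch.ones(out_features, in_features))
        self.prior = Normal(0.0, prior_scale)  # fixed Gaussian prior p(w)

    def forward(self, x):
        sigma = F.softplus(self.rho)
        q = Normal(self.mu, sigma)
        w = q.rsample()  # reparameterized draw: w = mu + sigma * eps
        # Per-draw KL contribution log q(w) - log p(w), evaluated at the
        # sampled weights, as BBB does instead of using the closed-form KL.
        kl = (q.log_prob(w) - self.prior.log_prob(w)).sum()
        return x @ w.t(), kl

def bbb_loss(layer, x, y, num_samples=5):
    """Monte Carlo estimate of the negative ELBO, averaged over
    num_samples weight draws; the convergence claim in the abstract
    concerns this average as num_samples grows."""
    total = 0.0
    for _ in range(num_samples):
        pred, kl = layer(x)
        nll = F.mse_loss(pred, y, reduction="sum")  # Gaussian likelihood up to a constant
        total = total + kl + nll
    return total / num_samples

# Usage: one gradient step on synthetic data.
layer = BayesianLinear(4, 1)
x, y = torch.randn(32, 4), torch.randn(32, 1)
opt = torch.optim.Adam(layer.parameters(), lr=1e-2)
loss = bbb_loss(layer, x, y, num_samples=5)
loss.backward()
opt.step()
```

By Khinchin's law of large numbers, the sample average returned by `bbb_loss` converges in probability to its expectation under the variational posterior as `num_samples` grows, which matches the sense in which the abstract states that the convergence probability of BBB approaches one.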
