

Variational Bayesian Approximation (VBA): Implementation and Comparison of Different Optimization Algorithms

Author information

Fallah Mortezanejad Seyedeh Azadeh, Mohammad-Djafari Ali

Affiliations

School of Automotive and Traffic Engineering, Jiangsu University, Zhenjiang 212013, China.

International Science Consulting and Training (ISCT), 91440 Bures sur Yvette, France.

Publication information

Entropy (Basel). 2024 Aug 20;26(8):707. doi: 10.3390/e26080707.

DOI: 10.3390/e26080707
PMID: 39202177
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11353284/
Abstract

In any Bayesian computations, the first step is to derive the joint distribution of all the unknown variables given the observed data. Then, we have to do the computations. There are four general methods for performing computations: Joint MAP optimization; Posterior expectation computations that require integration methods; Sampling-based methods, such as MCMC, slice sampling, nested sampling, etc., for generating samples and numerically computing expectations; and finally, Variational Bayesian Approximation (VBA). In this last method, which is the focus of this paper, the objective is to search for an approximation for the joint posterior with a simpler one that allows for analytical computations. The main tool in VBA is to use the Kullback-Leibler Divergence (KLD) as a criterion to obtain that approximation. Even if, theoretically, this can be conducted formally, for practical reasons, we consider the case where the joint distribution is in the exponential family, and so is its approximation. In this case, the KLD becomes a function of the usual parameters or the natural parameters of the exponential family, where the problem becomes parametric optimization. Thus, we compare four optimization algorithms: general alternate functional optimization; parametric gradient-based with the normal and natural parameters; and the natural gradient algorithm. We then study their relative performances on three examples to demonstrate the implementation of each algorithm and their efficiency performance.
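To make the parametric route concrete, here is a minimal sketch of the gradient-based VBA idea the abstract describes: when both the target and the approximation are Gaussian (a member of the exponential family), the KLD has a closed form in the usual parameters, and minimizing it becomes ordinary parametric optimization. All names and the toy target below are our own illustration, not code from the paper.

```python
import math

# Toy target "posterior": a 1-D Gaussian N(mu_star, sigma_star^2).
# Approximating family: q = N(m, s^2). We minimize the closed-form
# KL(q || p) by plain gradient descent on the usual parameters (m, s).
mu_star, sigma_star = 2.0, 0.5

def kl_gauss(m, s, mu, sigma):
    # Closed-form KL( N(m, s^2) || N(mu, sigma^2) )
    return math.log(sigma / s) + (s**2 + (m - mu)**2) / (2 * sigma**2) - 0.5

m, s = 0.0, 2.0          # initial variational parameters
lr = 0.05                # step size
for _ in range(2000):
    # Analytic gradients of KL(q || p) w.r.t. m and s
    grad_m = (m - mu_star) / sigma_star**2
    grad_s = -1.0 / s + s / sigma_star**2
    m -= lr * grad_m
    s -= lr * grad_s

print(m, s, kl_gauss(m, s, mu_star, sigma_star))  # converges to (mu_star, sigma_star, ~0)
```

In this conjugate toy case the optimum recovers the target exactly; in realistic models the approximation family is strictly simpler than the posterior, and the minimized KLD stays positive.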

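The first algorithm the abstract lists, alternate functional optimization, can also be sketched on a standard conjugate example: mean-field coordinate-ascent updates for a Gaussian likelihood with unknown mean and precision under a Normal-Gamma prior (the classic textbook case, e.g. Bishop, PRML, Sec. 10.1.3). The synthetic data and hyperparameter names are our own; this is an illustration of the technique, not the paper's code.

```python
import numpy as np

# Model: x_i ~ N(mu, 1/tau), priors mu | tau ~ N(mu0, 1/(lam0*tau)),
# tau ~ Gamma(a0, b0). Mean-field factorization q(mu, tau) = q(mu) q(tau)
# gives closed-form coordinate-ascent updates, alternated to convergence.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=0.5, size=500)    # synthetic data (true tau = 4)
N, xbar = len(x), x.mean()

mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0          # prior hyperparameters
E_tau = 1.0                                      # initial guess for E_q[tau]
for _ in range(100):
    # Update q(mu) = N(mu_N, 1/lam_N), holding q(tau) fixed
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_N = (lam0 + N) * E_tau
    # Update q(tau) = Gamma(a_N, b_N), holding q(mu) fixed
    a_N = a0 + (N + 1) / 2
    E_sq = np.sum((x - mu_N) ** 2) + N / lam_N   # E_q[ sum_i (x_i - mu)^2 ]
    b_N = b0 + 0.5 * (E_sq + lam0 * ((mu_N - mu0) ** 2 + 1 / lam_N))
    E_tau = a_N / b_N

print(mu_N, E_tau)   # approximate posterior mean of mu and of tau
```

Each update is the exact optimum of the KLD with the other factor held fixed, so every sweep decreases the divergence monotonically; this is the functional counterpart of the parametric gradient schemes the paper compares it against.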

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/2b9448046177/entropy-26-00707-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/2d6354aad970/entropy-26-00707-g001a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/42647786a570/entropy-26-00707-g002a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/b9e568ef1f4a/entropy-26-00707-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/3864f34b71ad/entropy-26-00707-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/2b9448046177/entropy-26-00707-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/2d6354aad970/entropy-26-00707-g001a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/42647786a570/entropy-26-00707-g002a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/b9e568ef1f4a/entropy-26-00707-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/3864f34b71ad/entropy-26-00707-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7f7e/11353284/2b9448046177/entropy-26-00707-g005.jpg

Similar articles

1. Variational Bayesian Approximation (VBA): Implementation and Comparison of Different Optimization Algorithms.
Entropy (Basel). 2024 Aug 20;26(8):707. doi: 10.3390/e26080707.
2. Precise periodic components estimation for chronobiological signals through Bayesian Inference with sparsity enforcing prior.
EURASIP J Bioinform Syst Biol. 2016 Jan 20;2016(1):3. doi: 10.1186/s13637-015-0033-6. eCollection 2016 Dec.
3. Efficient variational Bayesian approximation method based on subspace optimization.
IEEE Trans Image Process. 2015 Feb;24(2):681-93. doi: 10.1109/TIP.2014.2383321. Epub 2014 Dec 18.
4. Variational Bayesian Algorithms for Maneuvering Target Tracking with Nonlinear Measurements in Sensor Networks.
Entropy (Basel). 2023 Aug 18;25(8):1235. doi: 10.3390/e25081235.
5. Variational approximation error in non-negative matrix factorization.
Neural Netw. 2020 Jun;126:65-75. doi: 10.1016/j.neunet.2020.03.009. Epub 2020 Mar 13.
6. A Generic Formula and Some Special Cases for the Kullback-Leibler Divergence between Central Multivariate Cauchy Distributions.
Entropy (Basel). 2022 Jun 17;24(6):838. doi: 10.3390/e24060838.
7. An Iterative Nonlinear Filter Using Variational Bayesian Optimization.
Sensors (Basel). 2018 Dec 1;18(12):4222. doi: 10.3390/s18124222.
8. Applications of a Kullback-Leibler Divergence for Comparing Non-nested Models.
Stat Modelling. 2013 Dec;13(5-6):409-429. doi: 10.1177/1471082X13494610.
9. Sampling the Variational Posterior with Local Refinement.
Entropy (Basel). 2021 Nov 8;23(11):1475. doi: 10.3390/e23111475.
10. A Kullback-Leibler Divergence for Bayesian Model Diagnostics.
Open J Stat. 2011 Oct;1(3):172-184. doi: 10.4236/ojs.2011.13021.

References cited in this article

1. Efficient variational Bayesian approximation method based on subspace optimization.
IEEE Trans Image Process. 2015 Feb;24(2):681-93. doi: 10.1109/TIP.2014.2383321. Epub 2014 Dec 18.
2. Joint NDT image restoration and segmentation using Gauss-Markov-Potts prior models and variational Bayesian computation.
IEEE Trans Image Process. 2010 Sep;19(9):2265-77. doi: 10.1109/TIP.2010.2047902. Epub 2010 Apr 8.
3. The AIC criterion and symmetrizing the Kullback-Leibler divergence.
IEEE Trans Neural Netw. 2007 Jan;18(1):97-106. doi: 10.1109/TNN.2006.882813.
4. Applications of Bayesian statistical methods in microarray data analysis.
Am J Pharmacogenomics. 2004;4(1):53-62. doi: 10.2165/00129785-200404010-00006.
5. Fast curvature matrix-vector products for second-order gradient descent.
Neural Comput. 2002 Jul;14(7):1723-38. doi: 10.1162/08997660260028683.