

Sampling the Variational Posterior with Local Refinement.

Authors

Havasi Marton, Snoek Jasper, Tran Dustin, Gordon Jonathan, Hernández-Lobato José Miguel

Affiliations

Department of Engineering, University of Cambridge, Cambridge CB2 1PZ, UK.

Brain Team, Google Research, Mountain View, CA 94043, USA.

Publication

Entropy (Basel). 2021 Nov 8;23(11):1475. doi: 10.3390/e23111475.

DOI: 10.3390/e23111475
PMID: 34828173
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8621907/
Abstract

Variational inference is an optimization-based method for approximating the posterior distribution of the parameters in Bayesian probabilistic models. A key challenge of variational inference is to approximate the posterior with a distribution that is computationally tractable yet sufficiently expressive. We propose a novel method for generating samples from a highly flexible variational approximation. The method starts with a coarse initial approximation and generates samples by refining it in selected, local regions. This allows the samples to capture dependencies and multi-modality in the posterior, even when these are absent from the initial approximation. We demonstrate theoretically that our method always improves the quality of the approximation (as measured by the evidence lower bound). In experiments, our method consistently outperforms recent variational inference methods in terms of log-likelihood and ELBO across three example tasks: the Eight-Schools example (an inference task in a hierarchical model), training a ResNet-20 (Bayesian inference in a large neural network), and the Mushroom task (posterior sampling in a contextual bandit problem).
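The core idea in the abstract — draw from a coarse initial approximation, then improve each draw in a selected local region — can be illustrated with a minimal, hypothetical sketch. This is not the authors' implementation: the paper refines a local variational distribution by maximizing the ELBO, whereas here plain gradient ascent on an unnormalized log-posterior stands in for that refinement step, on a toy bimodal target. All function names and parameter values below are illustrative assumptions.

```python
import numpy as np

def log_p(x):
    # Unnormalized log-posterior: an equal mixture of N(3, 1) and N(-3, 1),
    # i.e. a bimodal target that a single Gaussian cannot capture.
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def grad_log_p(x, eps=1e-5):
    # Central finite difference, to keep the sketch dependency-free.
    return (log_p(x + eps) - log_p(x - eps)) / (2 * eps)

def refined_sample(rng, mu0=0.0, sigma0=4.0, steps=50, lr=0.5):
    # Draw from the coarse initial approximation q0 = N(mu0, sigma0^2) ...
    x = rng.normal(mu0, sigma0)
    # ... then refine the draw locally by gradient ascent on log p,
    # so it settles into whichever mode it started nearest to.
    for _ in range(steps):
        x += lr * grad_log_p(x)
    return x

rng = np.random.default_rng(0)
samples = np.array([refined_sample(rng) for _ in range(200)])
# Refined samples concentrate near the two posterior modes at +3 and -3,
# capturing multi-modality that the single broad initial Gaussian misses.
```

The sketch preserves the qualitative point of the method: the initial approximation only has to place mass near the modes, and the local refinement step supplies the detail.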


Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4026/8621907/6d1cfa16ad97/entropy-23-01475-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4026/8621907/36801e463c8d/entropy-23-01475-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4026/8621907/27821b936d72/entropy-23-01475-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4026/8621907/a01d297fbbbd/entropy-23-01475-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4026/8621907/c624aa11cb3d/entropy-23-01475-g005.jpg

Similar articles

1. Sampling the Variational Posterior with Local Refinement. Entropy (Basel). 2021 Nov 8;23(11):1475. doi: 10.3390/e23111475.
2. Gradient Regularization as Approximate Variational Inference. Entropy (Basel). 2021 Dec 3;23(12):1629. doi: 10.3390/e23121629.
3. Variationally Inferred Sampling through a Refined Bound. Entropy (Basel). 2021 Jan 19;23(1):123. doi: 10.3390/e23010123.
4. Variational approximation error in non-negative matrix factorization. Neural Netw. 2020 Jun;126:65-75. doi: 10.1016/j.neunet.2020.03.009. Epub 2020 Mar 13.
5. Efficient variational Bayesian approximation method based on subspace optimization. IEEE Trans Image Process. 2015 Feb;24(2):681-93. doi: 10.1109/TIP.2014.2383321. Epub 2014 Dec 18.
6. Variational Inference via Rényi Bound Optimization and Multiple-Source Adaptation. Entropy (Basel). 2023 Oct 20;25(10):1468. doi: 10.3390/e25101468.
7. History Marginalization Improves Forecasting in Variational Recurrent Neural Networks. Entropy (Basel). 2021 Nov 24;23(12):1563. doi: 10.3390/e23121563.
8. Differentiable samplers for deep latent variable models. Philos Trans A Math Phys Eng Sci. 2023 May 15;381(2247):20220147. doi: 10.1098/rsta.2022.0147. Epub 2023 Mar 27.
9. Advances in Variational Inference. IEEE Trans Pattern Anal Mach Intell. 2019 Aug;41(8):2008-2026. doi: 10.1109/TPAMI.2018.2889774. Epub 2018 Dec 25.
10. Bayesian Estimation of Inverted Beta Mixture Models With Extended Stochastic Variational Inference for Positive Vector Classification. IEEE Trans Neural Netw Learn Syst. 2024 May;35(5):6948-6962. doi: 10.1109/TNNLS.2022.3213518. Epub 2024 May 2.
