

GAD-PVI: A General Accelerated Dynamic-Weight Particle-Based Variational Inference Framework.

Authors

Wang Fangyikang, Zhu Huminhao, Zhang Chao, Zhao Hanbin, Qian Hui

Affiliations

College of Computer Science and Technology, Zhejiang University, Hangzhou 310058, China.

Publication

Entropy (Basel). 2024 Aug 11;26(8):679. doi: 10.3390/e26080679.

DOI: 10.3390/e26080679
PMID: 39202149
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11354113/
Abstract

Particle-based Variational Inference (ParVI) methods have been widely adopted in deep Bayesian inference tasks such as Bayesian neural networks or Gaussian Processes, owing to their efficiency in generating high-quality samples given the score of the target distribution. Typically, ParVI methods evolve a weighted-particle system by approximating the first-order Wasserstein gradient flow to reduce the dissimilarity between the particle system's empirical distribution and the target distribution. Recent advancements in ParVI have explored sophisticated gradient flows to obtain refined particle systems with either accelerated position updates or dynamic weight adjustments. In this paper, we introduce the semi-Hamiltonian gradient flow on a novel Information-Fisher-Rao space, known as the SHIFR flow, and propose the first ParVI framework that possesses both accelerated position update and dynamical weight adjustment simultaneously, named the General Accelerated Dynamic-Weight Particle-based Variational Inference (GAD-PVI) framework. GAD-PVI is compatible with different dissimilarities between the empirical distribution and the target distribution, as well as different approximation approaches to gradient flow. Moreover, when the appropriate dissimilarity is selected, GAD-PVI is also suitable for obtaining high-quality samples even when analytical scores cannot be obtained. Experiments conducted under both the score-based tasks and sample-based tasks demonstrate the faster convergence and reduced approximation error of GAD-PVI methods over the state-of-the-art.
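The abstract describes the core ParVI mechanism: a particle system evolves along an approximate Wasserstein gradient flow driven by the score of the target distribution. As an illustrative baseline (not the paper's GAD-PVI algorithm, whose semi-Hamiltonian SHIFR flow also updates particle weights), the sketch below implements Stein Variational Gradient Descent (SVGD), a standard fixed-weight ParVI method; the RBF bandwidth `h`, step size `eps`, and the standard-normal toy target are illustrative assumptions.

```python
import numpy as np

def svgd_step(x, score, h=1.0, eps=0.1):
    """One SVGD update: particles descend a kernelized approximation of the
    Wasserstein gradient flow toward the target whose score function is `score`."""
    n = x.shape[0]
    diffs = x[:, None, :] - x[None, :, :]        # diffs[j, i] = x_j - x_i, shape (n, n, d)
    k = np.exp(-np.sum(diffs**2, axis=-1) / h)   # RBF kernel matrix k(x_j, x_i)
    grad_k = (-2.0 / h) * k[:, :, None] * diffs  # gradient of k(x_j, x_i) w.r.t. x_j
    # phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]
    # The first term pulls particles toward high-density regions; the second repels
    # them from each other, so the empirical distribution spreads to match the target.
    phi = (k @ score(x) + grad_k.sum(axis=0)) / n
    return x + eps * phi

# Toy usage: push particles toward a standard normal, whose score is -x.
rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=0.5, size=(100, 1))  # initialize far from the target
for _ in range(1000):
    x = svgd_step(x, lambda p: -p, h=1.0, eps=0.05)
```

After the loop the particle cloud should roughly match the standard normal (mean near 0, standard deviation near 1). GAD-PVI augments this kind of position update with momentum-style acceleration and dynamic particle weights; SVGD is shown here only because its update rule is simple and well established.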


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/916c/11354113/44bfa8f7300b/entropy-26-00679-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/916c/11354113/ea20ba19cf56/entropy-26-00679-g0A3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/916c/11354113/fcfb2db7d515/entropy-26-00679-g0A4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/916c/11354113/29ded62d23d2/entropy-26-00679-g0A5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/916c/11354113/d7f3ab77ddee/entropy-26-00679-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/916c/11354113/1c5e8c4025f2/entropy-26-00679-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/916c/11354113/9287aab150ee/entropy-26-00679-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/916c/11354113/ed4456fa83f7/entropy-26-00679-g004.jpg

Similar Articles

1. GAD-PVI: A General Accelerated Dynamic-Weight Particle-Based Variational Inference Framework.
   Entropy (Basel). 2024 Aug 11;26(8):679. doi: 10.3390/e26080679.
2. Sampling the Variational Posterior with Local Refinement.
   Entropy (Basel). 2021 Nov 8;23(11):1475. doi: 10.3390/e23111475.
3. Flexible and Efficient Inference with Particles for the Variational Gaussian Approximation.
   Entropy (Basel). 2021 Jul 30;23(8):990. doi: 10.3390/e23080990.
4. Neural Operator Variational Inference Based on Regularized Stein Discrepancy for Deep Gaussian Processes.
   IEEE Trans Neural Netw Learn Syst. 2025 Apr;36(4):6723-6737. doi: 10.1109/TNNLS.2024.3406635. Epub 2025 Apr 4.
5. A Geometric Variational Approach to Bayesian Inference.
   J Am Stat Assoc. 2020;115(530):822-835. doi: 10.1080/01621459.2019.1585253. Epub 2019 Apr 30.
6. Variational Hamiltonian Monte Carlo via Score Matching.
   Bayesian Anal. 2018 Jun;13(2):485-506. doi: 10.1214/17-ba1060. Epub 2017 Jul 25.
7. Efficient variational Bayesian approximation method based on subspace optimization.
   IEEE Trans Image Process. 2015 Feb;24(2):681-93. doi: 10.1109/TIP.2014.2383321. Epub 2014 Dec 18.
8. Stein Variational Gradient Descent with Matrix-Valued Kernels.
   Adv Neural Inf Process Syst. 2019 Dec;32:7834-7844.
9. Scalable Moment Propagation and Analysis of Variational Distributions for Practical Bayesian Deep Learning.
   IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):4614-4624. doi: 10.1109/TNNLS.2024.3367363. Epub 2025 Feb 28.
10. Neural Dynamics under Active Inference: Plausibility and Efficiency of Information Processing.
   Entropy (Basel). 2021 Apr 12;23(4):454. doi: 10.3390/e23040454.

References Cited in This Article

1. Flexible and Efficient Inference with Particles for the Variational Gaussian Approximation.
   Entropy (Basel). 2021 Jul 30;23(8):990. doi: 10.3390/e23080990.
2. Geometric Variational Inference.
   Entropy (Basel). 2021 Jul 2;23(7):853. doi: 10.3390/e23070853.
3. Extended Variational Message Passing for Automated Approximate Bayesian Inference.
   Entropy (Basel). 2021 Jun 26;23(7):815. doi: 10.3390/e23070815.
4. Principles of Bayesian Inference Using General Divergence Criteria.
   Entropy (Basel). 2018 Jun 6;20(6):442. doi: 10.3390/e20060442.
5. Reconstructing Perceived Images From Human Brain Activities With Bayesian Deep Multiview Learning.
   IEEE Trans Neural Netw Learn Syst. 2019 Aug;30(8):2310-2323. doi: 10.1109/TNNLS.2018.2882456. Epub 2018 Dec 12.
6. Nonparametric Bayesian Correlated Group Regression With Applications to Image Classification.
   IEEE Trans Neural Netw Learn Syst. 2018 Nov;29(11):5330-5344. doi: 10.1109/TNNLS.2018.2797539. Epub 2018 Feb 20.
7. A variational perspective on accelerated methods in optimization.
   Proc Natl Acad Sci U S A. 2016 Nov 22;113(47):E7351-E7358. doi: 10.1073/pnas.1614734113. Epub 2016 Nov 9.
8. Variational Bayesian Inference Algorithms for Infinite Relational Model of Network Data.
   IEEE Trans Neural Netw Learn Syst. 2015 Sep;26(9):2176-81. doi: 10.1109/TNNLS.2014.2362012. Epub 2014 Oct 28.