


Variational Inference via Rényi Bound Optimization and Multiple-Source Adaptation.

Authors

Dana Zalman (Oshri), Shai Fine

Affiliations

School of Computer Science, Reichman University, Herzliya 4610101, Israel.

Data Science Institute, Reichman University, Herzliya 4610101, Israel.

Publication

Entropy (Basel). 2023 Oct 20;25(10):1468. doi: 10.3390/e25101468.

DOI: 10.3390/e25101468
PMID: 37895589
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10606691/
Abstract

Variational inference provides a way to approximate probability densities through optimization. It does so by optimizing an upper or a lower bound of the likelihood of the observed data (the evidence). The classic variational inference approach suggests maximizing the Evidence Lower Bound (ELBO). Recent studies proposed to optimize the variational Rényi bound (VR) and the χ upper bound. However, these estimates, which are based on the Monte Carlo (MC) approximation, either underestimate the bound or exhibit a high variance. In this work, we introduce a new upper bound, termed the Variational Rényi Log Upper bound (VRLU), which is based on the existing VR bound. In contrast to the existing VR bound, the MC approximation of the VRLU bound maintains the upper bound property. Furthermore, we devise a (sandwiched) upper-lower bound variational inference method, termed the Variational Rényi Sandwich (VRS), to jointly optimize the upper and lower bounds. We present a set of experiments, designed to evaluate the new VRLU bound and to compare the VRS method with the classic Variational Autoencoder (VAE) and the VR methods. Next, we apply the VRS approximation to the Multiple-Source Adaptation problem (MSA). MSA is a real-world scenario where data are collected from multiple sources that differ from one another by their probability distribution over the input space. The main aim is to combine fairly accurate predictive models from these sources and create an accurate model for new, mixed target domains. However, many domain adaptation methods assume prior knowledge of the data distribution in the source domains. In this work, we apply the suggested VRS density estimate to the Multiple-Source Adaptation problem (MSA) and show, both theoretically and empirically, that it provides tighter error bounds and improved performance, compared to leading MSA methods.
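The bound family the abstract discusses can be illustrated with a small numerical sketch. This is not the paper's implementation: the toy conjugate-Gaussian model, the imperfect variational posterior q, and the helper names `log_norm_pdf` and `vr_bound` are all illustrative assumptions. The sketch computes Monte Carlo estimates of the variational Rényi bound L_α, which recovers the ELBO as α → 1, tightens toward log p(x) as α decreases, and enters the upper-bound regime for α < 0.

```python
import math
import random

def log_norm_pdf(x, mu, sigma):
    """Log-density of N(mu, sigma^2) at x."""
    return -0.5 * math.log(2.0 * math.pi * sigma * sigma) - (x - mu) ** 2 / (2.0 * sigma * sigma)

def vr_bound(log_w, alpha):
    """Monte Carlo estimate of the variational Renyi bound L_alpha,
    (1/(1-alpha)) * log((1/K) * sum_k w_k^(1-alpha)), from log importance
    weights log_w; alpha = 1 recovers the ELBO as the limiting case."""
    k = len(log_w)
    if abs(alpha - 1.0) < 1e-12:
        return sum(log_w) / k  # ELBO: mean of the log weights
    s = 1.0 - alpha
    m = max(s * lw for lw in log_w)  # stabilised log-sum-exp
    return (m + math.log(sum(math.exp(s * lw - m) for lw in log_w) / k)) / s

random.seed(0)
# Toy conjugate model: z ~ N(0, 1), x | z ~ N(z, 1), one observation x.
x = 1.5
# Marginally x ~ N(0, 2), so the true log-evidence is available in closed form.
true_log_evidence = log_norm_pdf(x, 0.0, math.sqrt(2.0))

# A deliberately imperfect variational posterior q(z) = N(0.6, 0.9^2);
# the exact posterior here is N(0.75, 0.5).
mu_q, sigma_q = 0.6, 0.9
K = 100_000
zs = [random.gauss(mu_q, sigma_q) for _ in range(K)]
log_w = [log_norm_pdf(x, z, 1.0) + log_norm_pdf(z, 0.0, 1.0)
         - log_norm_pdf(z, mu_q, sigma_q) for z in zs]

elbo = vr_bound(log_w, 1.0)      # classic lower bound
vr_half = vr_bound(log_w, 0.5)   # tighter lower-bound regime
vr_zero = vr_bound(log_w, 0.0)   # importance-sampled log-evidence estimate
vr_neg = vr_bound(log_w, -1.0)   # upper-bound regime (alpha < 0)
print(elbo, vr_half, vr_zero, vr_neg, true_log_evidence)
```

For a fixed set of weights the estimate is the log of a power mean of order 1 − α, so it is non-increasing in α: the printed values satisfy ELBO ≤ L_0.5 ≤ L_0 ≤ L_-1, with L_0 close to the true log-evidence. The abstract's observation that MC estimates of the upper bounds can underestimate them is exactly the bias that Jensen's inequality introduces into such finite-sample averages.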


Figures (entropy-25-01468, g001–g011):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/12b2f1474a42/entropy-25-01468-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/a4277f559c1a/entropy-25-01468-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/d9ced8e52b45/entropy-25-01468-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/c9f9352eea75/entropy-25-01468-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/dd7d9dfc107c/entropy-25-01468-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/975d8051e396/entropy-25-01468-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/2196bb0aff7b/entropy-25-01468-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/d568a0c5d2a9/entropy-25-01468-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/9ff99c958c7b/entropy-25-01468-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/70b1a9d3b5d4/entropy-25-01468-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a72/10606691/ca90ec3f4530/entropy-25-01468-g011.jpg

Similar Articles

1
Variational Inference via Rényi Bound Optimization and Multiple-Source Adaptation.
Entropy (Basel). 2023 Oct 20;25(10):1468. doi: 10.3390/e25101468.
2
Sampling the Variational Posterior with Local Refinement.
Entropy (Basel). 2021 Nov 8;23(11):1475. doi: 10.3390/e23111475.
3
Bayesian Brains and the Rényi Divergence.
Neural Comput. 2022 Mar 23;34(4):829-855. doi: 10.1162/neco_a_01484.
4
Variational approximation error in non-negative matrix factorization.
Neural Netw. 2020 Jun;126:65-75. doi: 10.1016/j.neunet.2020.03.009. Epub 2020 Mar 13.
5
Differentiable samplers for deep latent variable models.
Philos Trans A Math Phys Eng Sci. 2023 May 15;381(2247):20220147. doi: 10.1098/rsta.2022.0147. Epub 2023 Mar 27.
6
A Robust Solution to Variational Importance Sampling of Minimum Variance.
Entropy (Basel). 2020 Dec 12;22(12):1405. doi: 10.3390/e22121405.
7
Model-Induced Generalization Error Bound for Information-Theoretic Representation Learning in Source-Data-Free Unsupervised Domain Adaptation.
IEEE Trans Image Process. 2022;31:419-432. doi: 10.1109/TIP.2021.3130530. Epub 2021 Dec 9.
8
Learn From Unpaired Data for Image Restoration: A Variational Bayes Approach.
IEEE Trans Pattern Anal Mach Intell. 2023 May;45(5):5889-5903. doi: 10.1109/TPAMI.2022.3215571. Epub 2023 Apr 3.
9
Variationally Inferred Sampling through a Refined Bound.
Entropy (Basel). 2021 Jan 19;23(1):123. doi: 10.3390/e23010123.
10
Variational Inference for Watson Mixture Model.
IEEE Trans Pattern Anal Mach Intell. 2016 Sep;38(9):1886-900. doi: 10.1109/TPAMI.2015.2498935. Epub 2015 Nov 9.

References Cited in This Article

1
Unsupervised Domain Adaptation With Variational Approximation for Cardiac Segmentation.
IEEE Trans Med Imaging. 2021 Dec;40(12):3555-3567. doi: 10.1109/TMI.2021.3090412. Epub 2021 Nov 30.
2
Advances in Variational Inference.
IEEE Trans Pattern Anal Mach Intell. 2019 Aug;41(8):2008-2026. doi: 10.1109/TPAMI.2018.2889774. Epub 2018 Dec 25.