
Exploring the Trade-Off in the Variational Information Bottleneck for Regression with a Single Training Run.

Authors

Kudo Sota, Ono Naoaki, Kanaya Shigehiko, Huang Ming

Affiliations

Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan.

Institute of Advanced Computing and Digital Engineering, Shenzhen Institute of Advanced Technology, Shenzhen 518055, China.

Publication

Entropy (Basel). 2024 Nov 30;26(12):1043. doi: 10.3390/e26121043.

DOI: 10.3390/e26121043
PMID: 39766672
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11726874/
Abstract

An information bottleneck (IB) enables the acquisition of useful representations from data by retaining necessary information while reducing unnecessary information. In its objective function, the Lagrange multiplier β controls the trade-off between retention and reduction. This study analyzes the Variational Information Bottleneck (VIB), a standard IB method in deep learning, in the setting of regression problems and derives its optimal solution. Based on this analysis, we propose a framework for regression problems that can obtain the optimal solution of the VIB for all β values with a single training run, in contrast to conventional methods, which require one training run for each β. The optimization performance of this framework is discussed theoretically and demonstrated experimentally. Our approach not only improves the efficiency of exploring β in regression problems but also deepens the understanding of the IB's behavior and its effects in this setting.
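For context, the β discussed in the abstract is the coefficient in the standard VIB training objective. A minimal sketch in the usual deep-learning notation (which may differ from the paper's own), with stochastic encoder p(z|x), decoder q(y|z), and marginal approximation r(z):

    \mathcal{L}_{\mathrm{VIB}}(\beta)
        = \mathbb{E}_{p(x,y)}\,\mathbb{E}_{p(z \mid x)}\big[ -\log q(y \mid z) \big]
        + \beta \, \mathbb{E}_{p(x)}\big[ \mathrm{KL}\big( p(z \mid x) \,\|\, r(z) \big) \big]

The first term rewards retention of target information (up to the constant H(Y), the expectation of log q(y|z) lower-bounds I(Z;Y)), while the KL term upper-bounds I(X;Z) and penalizes retained input information; β sets the trade-off between the two. Conventional practice minimizes this objective once per candidate β, which is the per-β retraining cost the paper's single-run framework is designed to remove.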

Similar Articles

1. Exploring the Trade-Off in the Variational Information Bottleneck for Regression with a Single Training Run. Entropy (Basel). 2024 Nov 30;26(12):1043. doi: 10.3390/e26121043.
2. The Conditional Entropy Bottleneck. Entropy (Basel). 2020 Sep 8;22(9):999. doi: 10.3390/e22090999.
3. The Information Bottleneck's Ordinary Differential Equation: First-Order Root Tracking for the Information Bottleneck. Entropy (Basel). 2023 Sep 22;25(10):1370. doi: 10.3390/e25101370.
4. Variational Information Bottleneck for Semi-Supervised Classification. Entropy (Basel). 2020 Aug 27;22(9):943. doi: 10.3390/e22090943.
5. The Convex Information Bottleneck Lagrangian. Entropy (Basel). 2020 Jan 14;22(1):98. doi: 10.3390/e22010098.
6. Nonlinear quality-related fault detection using combined deep variational information bottleneck and variational autoencoder. ISA Trans. 2021 Aug;114:444-454. doi: 10.1016/j.isatra.2021.01.002. Epub 2021 Jan 11.
7. Fully Bayesian VIB-DeepSSM. Med Image Comput Comput Assist Interv. 2023 Oct;14222:346-356. doi: 10.1007/978-3-031-43898-1_34. Epub 2023 Oct 1.
8. A Comparison of Variational Bounds for the Information Bottleneck Functional. Entropy (Basel). 2020 Oct 29;22(11):1229. doi: 10.3390/e22111229.
9. Variational Information Bottleneck Regularized Deep Reinforcement Learning for Efficient Robotic Skill Adaptation. Sensors (Basel). 2023 Jan 9;23(2):762. doi: 10.3390/s23020762.
10. On the Difference between the Information Bottleneck and the Deep Information Bottleneck. Entropy (Basel). 2020 Jan 22;22(2):131. doi: 10.3390/e22020131.
