
On the Difference between the Information Bottleneck and the Deep Information Bottleneck.

Authors

Wieczorek Aleksander, Roth Volker

Affiliation

Department of Mathematics and Computer Science, University of Basel, CH-4051 Basel, Switzerland.

Publication

Entropy (Basel). 2020 Jan 22;22(2):131. doi: 10.3390/e22020131.

DOI: 10.3390/e22020131
PMID: 33285906
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7516540/
Abstract

Combining the information bottleneck model with deep learning by replacing mutual information terms with deep neural nets has proven successful in areas ranging from generative modelling to interpreting deep neural networks. In this paper, we revisit the deep variational information bottleneck and the assumptions needed for its derivation. The two assumed properties of the data X and Y and their latent representation T take the form of two Markov chains T - X - Y and X - T - Y. Requiring both to hold during the optimisation process can be limiting for the set of potential joint distributions P(X, Y, T). We therefore show how to circumvent this limitation by optimising a lower bound for the mutual information between T and Y, I(T; Y), for which only the latter Markov chain has to be satisfied. The mutual information I(T; Y) can be split into two non-negative parts. The first part is the lower bound for I(T; Y), which is optimised in the deep variational information bottleneck (DVIB) and cognate models in practice. The second part consists of two terms that measure how much the former requirement T - X - Y is violated. Finally, we propose interpreting the family of information bottleneck models as directed graphical models, and show that in this framework, the original and deep information bottlenecks are special cases of a fundamental IB model.
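As a reference point for the bound discussed in the abstract, the standard variational lower bound on I(T; Y) and the usual DVIB-style objective can be sketched as follows (this is the well-known generic formulation with a decoder q(y|t), encoder p(t|x), variational prior r(t), and trade-off parameter β; the paper's own decomposition of I(T; Y) may differ in its exact terms):

```latex
% Variational lower bound on I(T;Y): non-negativity of the KL divergence
% between p(y|t) and a decoder q(y|t) gives
%   I(T;Y) >= E_{p(y,t)}[ log q(y|t) ] + H(Y),
% and only the chain X - T - Y is required here.
I(T;Y) \;\ge\; \mathbb{E}_{p(y,t)}\!\left[\log q(y\mid t)\right] + H(Y)

% Generic DVIB-style objective optimised in practice:
\max_{p(t\mid x),\, q(y\mid t)} \;
  \mathbb{E}_{p(x,y)}\,\mathbb{E}_{p(t\mid x)}\!\left[\log q(y\mid t)\right]
  \;-\; \beta\,\mathrm{KL}\!\left(p(t\mid x)\,\middle\|\,r(t)\right)
```

Since H(Y) does not depend on the model, maximising the expected log-likelihood term is equivalent to maximising the lower bound on I(T; Y).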

Similar Articles

1
On the Difference between the Information Bottleneck and the Deep Information Bottleneck.
Entropy (Basel). 2020 Jan 22;22(2):131. doi: 10.3390/e22020131.
2
Information Bottleneck Analysis by a Conditional Mutual Information Bound.
Entropy (Basel). 2021 Jul 29;23(8):974. doi: 10.3390/e23080974.
3
Utilizing Information Bottleneck to Evaluate the Capability of Deep Neural Networks for Image Classification.
Entropy (Basel). 2019 May 1;21(5):456. doi: 10.3390/e21050456.
4
Nonlinear quality-related fault detection using combined deep variational information bottleneck and variational autoencoder.
ISA Trans. 2021 Aug;114:444-454. doi: 10.1016/j.isatra.2021.01.002. Epub 2021 Jan 11.
5
Variational Information Bottleneck for Unsupervised Clustering: Deep Gaussian Mixture Embedding.
Entropy (Basel). 2020 Feb 13;22(2):213. doi: 10.3390/e22020213.
6
CCGIB: A Cross-Channel Graph Information Bottleneck Principle.
IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):9488-9499. doi: 10.1109/TNNLS.2024.3413750. Epub 2025 May 2.
7
The Convex Information Bottleneck Lagrangian.
Entropy (Basel). 2020 Jan 14;22(1):98. doi: 10.3390/e22010098.
8
Variational Information Bottleneck for Semi-Supervised Classification.
Entropy (Basel). 2020 Aug 27;22(9):943. doi: 10.3390/e22090943.
9
A Comparison of Variational Bounds for the Information Bottleneck Functional.
Entropy (Basel). 2020 Oct 29;22(11):1229. doi: 10.3390/e22111229.
10
Bottleneck Problems: An Information and Estimation-Theoretic View.
Entropy (Basel). 2020 Nov 20;22(11):1325. doi: 10.3390/e22111325.

Cited By

1
Information Bottleneck Analysis by a Conditional Mutual Information Bound.
Entropy (Basel). 2021 Jul 29;23(8):974. doi: 10.3390/e23080974.
2
A Comparison of Variational Bounds for the Information Bottleneck Functional.
Entropy (Basel). 2020 Oct 29;22(11):1229. doi: 10.3390/e22111229.
3
Information Bottleneck for Estimating Treatment Effects with Systematically Missing Covariates.

References

1
Information Dropout: Learning Optimal Representations Through Noisy Computation.
IEEE Trans Pattern Anal Mach Intell. 2018 Dec;40(12):2897-2905. doi: 10.1109/TPAMI.2017.2784440. Epub 2018 Jan 10.
Entropy (Basel). 2020 Mar 29;22(4):389. doi: 10.3390/e22040389.
4
On the Information Bottleneck Problems: Models, Connections, Applications and Information Theoretic Views.
Entropy (Basel). 2020 Jan 27;22(2):151. doi: 10.3390/e22020151.