A Comparison of Variational Bounds for the Information Bottleneck Functional.

Author Information

Geiger Bernhard C, Fischer Ian S

Affiliations

Know-Center GmbH, Inffeldgasse 13/6, 8010 Graz, Austria.

Google Research, Mountain View, CA 94043, USA.

Publication Information

Entropy (Basel). 2020 Oct 29;22(11):1229. doi: 10.3390/e22111229.

DOI: 10.3390/e22111229
PMID: 33286997
Full text link: https://pmc.ncbi.nlm.nih.gov/articles/PMC7712881/
Abstract

In this short note, we relate the variational bounds proposed in Alemi et al. (2017) and Fischer (2020) for the information bottleneck (IB) and the conditional entropy bottleneck (CEB) functional, respectively. Although the two functionals were shown to be equivalent, it was empirically observed that optimizing bounds on the CEB functional achieves better generalization performance and adversarial robustness than optimizing those on the IB functional. This work tries to shed light on this issue by showing that, in the most general setting, no ordering can be established between these variational bounds, while such an ordering can be enforced by restricting the feasible sets over which the optimizations take place. The absence of such an ordering in the general setup suggests that the variational bound on the CEB functional is either more amenable to optimization or a relevant cost function for optimization in its own regard, i.e., without justification from the IB or CEB functionals.
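
For readers who want the objects being compared written out, a minimal sketch of the two functionals and the variational bounds built on them follows. The notation (encoder p(z|x), Markov chain Y ↔ X ↔ Z, trade-off parameters β and γ, variational decoder q(y|z), variational marginal r(z), backward encoder b(z|y)) follows common usage in the IB/CEB literature and is assumed here for illustration, not quoted from the paper itself.

% IB (Tishby et al.) and CEB (Fischer, 2020) functionals, minimized over encoders p(z|x):
\mathcal{L}_{\mathrm{IB}}  = I(X;Z) - \beta\, I(Y;Z), \qquad
\mathcal{L}_{\mathrm{CEB}} = I(X;Z \mid Y) - \gamma\, I(Y;Z).

% With the Markov chain Y - X - Z, the chain rule of mutual information gives
I(X;Z \mid Y) = I(X;Z) - I(Y;Z),
% so the two functionals are equivalent up to reparametrization (\beta = \gamma + 1).

% Standard variational bounds on the individual terms:
% decoder bound (used in the VIB objective of Alemi et al., 2017)
I(Y;Z) \ge H(Y) + \mathbb{E}_{p(x,y)\,p(z \mid x)}\!\left[\log q(y \mid z)\right],
% marginal bound (VIB)
I(X;Z) \le \mathbb{E}_{p(x)}\!\left[D_{\mathrm{KL}}\!\left(p(z \mid x)\,\|\,r(z)\right)\right],
% backward-encoder bound (used in the CEB objective of Fischer, 2020)
I(X;Z \mid Y) \le \mathbb{E}_{p(x,y)\,p(z \mid x)}\!\left[\log \frac{p(z \mid x)}{b(z \mid y)}\right].

The gaps of these bounds are different nonnegative divergences (a cross-entropy gap for q, KL gaps for r and b), so the equivalence of the underlying functionals does not by itself order the resulting objectives; the note above shows that no ordering holds in general and identifies restrictions on the feasible sets under which one can be enforced.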

Similar Articles

1. A Comparison of Variational Bounds for the Information Bottleneck Functional.
   Entropy (Basel). 2020 Oct 29;22(11):1229. doi: 10.3390/e22111229.
2. The Conditional Entropy Bottleneck.
   Entropy (Basel). 2020 Sep 8;22(9):999. doi: 10.3390/e22090999.
3. CEB Improves Model Robustness.
   Entropy (Basel). 2020 Sep 25;22(10):1081. doi: 10.3390/e22101081.
4. CCGIB: A Cross-Channel Graph Information Bottleneck Principle.
   IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):9488-9499. doi: 10.1109/TNNLS.2024.3413750. Epub 2025 May 2.
5. On the Difference between the Information Bottleneck and the Deep Information Bottleneck.
   Entropy (Basel). 2020 Jan 22;22(2):131. doi: 10.3390/e22020131.
6. Learning Representations for Neural Network-Based Classification Using the Information Bottleneck Principle.
   IEEE Trans Pattern Anal Mach Intell. 2020 Sep;42(9):2225-2239. doi: 10.1109/TPAMI.2019.2909031. Epub 2019 Apr 2.
7. The Deterministic Information Bottleneck.
   Neural Comput. 2017 Jun;29(6):1611-1630. doi: 10.1162/NECO_a_00961. Epub 2017 Apr 14.
8. Adversarial Training With Anti-Adversaries.
   IEEE Trans Pattern Anal Mach Intell. 2024 Dec;46(12):10210-10227. doi: 10.1109/TPAMI.2024.3432973. Epub 2024 Nov 6.
9. Adversarial Information Bottleneck.
   IEEE Trans Neural Netw Learn Syst. 2022 May 20;PP. doi: 10.1109/TNNLS.2022.3172986.
10. Variational Information Bottleneck for Semi-Supervised Classification.
    Entropy (Basel). 2020 Aug 27;22(9):943. doi: 10.3390/e22090943.

Cited By

1. The Supervised Information Bottleneck.
   Entropy (Basel). 2025 Apr 22;27(5):452. doi: 10.3390/e27050452.
2. Partial Information Decomposition: Redundancy as Information Bottleneck.
   Entropy (Basel). 2024 Jun 26;26(7):546. doi: 10.3390/e26070546.
3. Information Bottleneck Analysis by a Conditional Mutual Information Bound.
   Entropy (Basel). 2021 Jul 29;23(8):974. doi: 10.3390/e23080974.
4. Information Bottleneck: Theory and Applications in Deep Learning.
   Entropy (Basel). 2020 Dec 14;22(12):1408. doi: 10.3390/e22121408.

References

1. CEB Improves Model Robustness.
   Entropy (Basel). 2020 Sep 25;22(10):1081. doi: 10.3390/e22101081.
2. The Conditional Entropy Bottleneck.
   Entropy (Basel). 2020 Sep 8;22(9):999. doi: 10.3390/e22090999.
3. On the Difference between the Information Bottleneck and the Deep Information Bottleneck.
   Entropy (Basel). 2020 Jan 22;22(2):131. doi: 10.3390/e22020131.
4. Learning Representations for Neural Network-Based Classification Using the Information Bottleneck Principle.
   IEEE Trans Pattern Anal Mach Intell. 2020 Sep;42(9):2225-2239. doi: 10.1109/TPAMI.2019.2909031. Epub 2019 Apr 2.
5. Information Dropout: Learning Optimal Representations Through Noisy Computation.
   IEEE Trans Pattern Anal Mach Intell. 2018 Dec;40(12):2897-2905. doi: 10.1109/TPAMI.2017.2784440. Epub 2018 Jan 10.