

The Convex Information Bottleneck Lagrangian.

Authors

Rodríguez Gálvez Borja, Thobaben Ragnar, Skoglund Mikael

Affiliation

Department of Intelligent Systems, Division of Information Science and Engineering (ISE), KTH Royal Institute of Technology, 11428 Stockholm, Sweden.

Publication

Entropy (Basel). 2020 Jan 14;22(1):98. doi: 10.3390/e22010098.

DOI: 10.3390/e22010098
PMID: 33285873
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7516537/
Abstract

The information bottleneck (IB) problem tackles the issue of obtaining relevant compressed representations T of some random variable X for the task of predicting Y. It is defined as a constrained optimization problem that maximizes the information the representation has about the task, I(T;Y), while ensuring that a certain level of compression r is achieved (i.e., I(X;T) ≤ r). For practical reasons, the problem is usually solved by maximizing the IB Lagrangian (i.e., L_IB(T;β) = I(T;Y) − βI(X;T)) for many values of β ∈ [0,1]. Then, the curve of maximal I(T;Y) for a given I(X;T) is drawn and a representation with the desired predictability and compression is selected. It is known that when Y is a deterministic function of X, the IB curve cannot be explored, and another Lagrangian has been proposed to tackle this problem, the squared IB Lagrangian: L_sq-IB(T;β_sq) = I(T;Y) − β_sq I(X;T)². In this paper, we (i) present a general family of Lagrangians which allow for the exploration of the IB curve in all scenarios; (ii) provide the exact one-to-one mapping between the Lagrange multiplier and the desired compression rate for known IB curve shapes; and (iii) show we can approximately obtain a specific compression level with the convex IB Lagrangian for both known and unknown IB curve shapes. This eliminates the burden of solving the optimization problem for many values of the Lagrange multiplier. That is, we prove that we can solve the original constrained problem with a single optimization.
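The objectives described in the abstract can be sketched numerically. Below is a minimal illustration (not the authors' code; function names and the toy mutual-information values are assumptions) of the classical IB Lagrangian and the convex-IB family, where h must be monotonically increasing and strictly convex; h(x) = x² recovers the squared-IB Lagrangian, and a linear h recovers the classical Lagrangian:

```python
def ib_lagrangian(i_ty: float, i_xt: float, beta: float) -> float:
    """Classical IB Lagrangian: L_IB = I(T;Y) - beta * I(X;T)."""
    return i_ty - beta * i_xt

def convex_ib_lagrangian(i_ty: float, i_xt: float, beta: float,
                         h=lambda x: x ** 2) -> float:
    """Convex IB Lagrangian family: L = I(T;Y) - beta * h(I(X;T)).

    h is assumed monotonically increasing and strictly convex;
    the default h(x) = x**2 gives the squared-IB Lagrangian.
    """
    return i_ty - beta * h(i_xt)

# Toy values (in nats): with h(x) = x**2 and beta = 0.1, reducing I(X;T)
# from 3 to 2 while I(T;Y) stays fixed changes the penalty by
# 0.1 * (9 - 4) = 0.5, so the objective rises by 0.5.
loose = convex_ib_lagrangian(1.0, 3.0, 0.1)   # 1.0 - 0.1 * 9 = 0.1
tight = convex_ib_lagrangian(1.0, 2.0, 0.1)   # 1.0 - 0.1 * 4 = 0.6
```

The intuition, per the abstract: when Y is a deterministic function of X the IB curve is linear, so the constant slope β of the classical Lagrangian cannot single out a point on it, whereas a strictly convex h makes the effective trade-off vary along the curve, letting a single multiplier value target a specific compression level.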


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/a43bcbac408c/entropy-22-00098-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/51d16943b296/entropy-22-00098-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/e3336512d227/entropy-22-00098-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/e52c5b973723/entropy-22-00098-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/802472287bd2/entropy-22-00098-g0A1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/fd4f2f4feeba/entropy-22-00098-g0A2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/58d647aaec4b/entropy-22-00098-g0A3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/e2c83e78709c/entropy-22-00098-g0A4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/f944d22bb4ab/entropy-22-00098-g0A5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/cc878ca7501d/entropy-22-00098-g0A6.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/cdf4e8cf77aa/entropy-22-00098-g0A7.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/21a1/7516537/56484a2eddd7/entropy-22-00098-g0A8.jpg

Similar articles

1
The Convex Information Bottleneck Lagrangian.
Entropy (Basel). 2020 Jan 14;22(1):98. doi: 10.3390/e22010098.
2
Adversarial Information Bottleneck.
IEEE Trans Neural Netw Learn Syst. 2022 May 20;PP. doi: 10.1109/TNNLS.2022.3172986.
3
The Deterministic Information Bottleneck.
Neural Comput. 2017 Jun;29(6):1611-1630. doi: 10.1162/NECO_a_00961. Epub 2017 Apr 14.
4
On the Difference between the Information Bottleneck and the Deep Information Bottleneck.
Entropy (Basel). 2020 Jan 22;22(2):131. doi: 10.3390/e22020131.
5
Pareto-Optimal Data Compression for Binary Classification Tasks.
Entropy (Basel). 2019 Dec 19;22(1):7. doi: 10.3390/e22010007.
6
Learning Representations for Neural Network-Based Classification Using the Information Bottleneck Principle.
IEEE Trans Pattern Anal Mach Intell. 2020 Sep;42(9):2225-2239. doi: 10.1109/TPAMI.2019.2909031. Epub 2019 Apr 2.
7
Bottleneck Problems: An Information and Estimation-Theoretic View.
Entropy (Basel). 2020 Nov 20;22(11):1325. doi: 10.3390/e22111325.
8
A Survey on Information Bottleneck.
IEEE Trans Pattern Anal Mach Intell. 2024 Aug;46(8):5325-5344. doi: 10.1109/TPAMI.2024.3366349. Epub 2024 Jul 2.
9
Gaussian Information Bottleneck and the Non-Perturbative Renormalization Group.
New J Phys. 2022 Mar;24(3). doi: 10.1088/1367-2630/ac395d. Epub 2022 Mar 9.
10
Exact and Soft Successive Refinement of the Information Bottleneck.
Entropy (Basel). 2023 Sep 19;25(9):1355. doi: 10.3390/e25091355.

Cited by

1
Adaptive information-constrained mapping for feature compression in edge AI and federated systems.
Sci Rep. 2025 Aug 22;15(1):30915. doi: 10.1038/s41598-025-16604-2.
2
Exploring the Trade-Off in the Variational Information Bottleneck for Regression with a Single Training Run.
Entropy (Basel). 2024 Nov 30;26(12):1043. doi: 10.3390/e26121043.
3
3
Partial Information Decomposition: Redundancy as Information Bottleneck.
Entropy (Basel). 2024 Jun 26;26(7):546. doi: 10.3390/e26070546.
4
Artificial Intelligence Algorithm-Based MRI for Differentiation Diagnosis of Prostate Cancer.
Comput Math Methods Med. 2022 Jun 28;2022:8123643. doi: 10.1155/2022/8123643. eCollection 2022.
5
Information Bottleneck: Theory and Applications in Deep Learning.
Entropy (Basel). 2020 Dec 14;22(12):1408. doi: 10.3390/e22121408.
6
On the Information Bottleneck Problems: Models, Connections, Applications and Information Theoretic Views.
Entropy (Basel). 2020 Jan 27;22(2):151. doi: 10.3390/e22020151.

References

1
Learning Representations for Neural Network-Based Classification Using the Information Bottleneck Principle.
IEEE Trans Pattern Anal Mach Intell. 2020 Sep;42(9):2225-2239. doi: 10.1109/TPAMI.2019.2909031. Epub 2019 Apr 2.
2
Efficient compression in color naming and its evolution.
Proc Natl Acad Sci U S A. 2018 Jul 31;115(31):7937-7942. doi: 10.1073/pnas.1800521115. Epub 2018 Jul 18.
3
Information Dropout: Learning Optimal Representations Through Noisy Computation.
IEEE Trans Pattern Anal Mach Intell. 2018 Dec;40(12):2897-2905. doi: 10.1109/TPAMI.2017.2784440. Epub 2018 Jan 10.
4
Toward a unified theory of efficient, predictive, and sparse coding.
Proc Natl Acad Sci U S A. 2018 Jan 2;115(1):186-191. doi: 10.1073/pnas.1711114115. Epub 2017 Dec 19.
5
The Deterministic Information Bottleneck.
Neural Comput. 2017 Jun;29(6):1611-1630. doi: 10.1162/NECO_a_00961. Epub 2017 Apr 14.
6
Minimum cross-entropy pattern classification and cluster analysis.
IEEE Trans Pattern Anal Mach Intell. 1982 Jan;4(1):11-7. doi: 10.1109/tpami.1982.4767189.
7
Information-based clustering.
Proc Natl Acad Sci U S A. 2005 Dec 20;102(51):18297-302. doi: 10.1073/pnas.0507432102. Epub 2005 Dec 13.