
Exact and Soft Successive Refinement of the Information Bottleneck

Authors

Hippolyte Charvin, Nicola Catenacci Volpi, Daniel Polani

Affiliation

School of Physics, Engineering and Computer Science, University of Hertfordshire, Hatfield AL10 9AB, UK.

Publication

Entropy (Basel). 2023 Sep 19;25(9):1355. doi: 10.3390/e25091355.

DOI: 10.3390/e25091355
PMID: 37761653
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10528077/
Abstract

The information bottleneck (IB) framework formalises the essential requirement for efficient information processing systems to achieve an optimal balance between the complexity of their representation and the amount of information extracted about relevant features. However, since the representation complexity affordable by real-world systems may vary in time, the processing cost of updating the representations should also be taken into account. A crucial question is thus the extent to which adaptive systems can leverage the information content of already existing representations to produce new ones, which target the same relevant features but at a different granularity. We investigate the information-theoretic optimal limits of this process by studying and extending, within the IB framework, the notion of successive refinement, which describes the ideal situation where no information needs to be discarded for adapting an IB-optimal representation's granularity. Thanks in particular to a new geometric characterisation, we analytically derive the successive refinability of some specific IB problems (for binary variables, for jointly Gaussian variables, and for the relevancy variable being a deterministic function of the source variable), and provide a linear-programming-based tool to numerically investigate, in the discrete case, the successive refinement of the IB. We then soften this notion into a quantification of the loss of information optimality induced by several-stage processing through an existing measure of unique information. Simple numerical experiments suggest that this quantity is typically low, though not entirely negligible. These results could have important implications for (i) the structure and efficiency of incremental learning in biological and artificial agents, (ii) the comparison of IB-optimal observation channels in statistical decision problems, and (iii) the IB theory of deep neural networks.
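For context, the following is a minimal sketch of the standard IB formulation and of the two-stage successive-refinement condition the abstract refers to. This is common background notation, not a verbatim statement of the paper's definitions, which may differ in detail. Given a source X, a relevancy variable Y, and a stochastic representation T satisfying the Markov chain Y \leftrightarrow X \leftrightarrow T, the IB problem at trade-off parameter \beta \ge 0 is

    \min_{p(t \mid x)} \; I(X;T) - \beta \, I(T;Y).

Exact successive refinement, in its simplest two-stage form, asks for a fine representation T_2 and a coarse one T_1, each solving the IB problem at its own trade-off point, that additionally form a single Markov chain

    Y \leftrightarrow X \leftrightarrow T_2 \leftrightarrow T_1,

so that the coarser representation is obtained by further processing the finer one, without returning to X and without any loss of IB optimality. The "soft" version studied in the paper then quantifies, via a measure of unique information, how far such several-stage processing falls from optimality when exact refinement is not achievable.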

Figures

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/88cffce0a232/entropy-25-01355-g001.jpg
Figure 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/1c5615fa5626/entropy-25-01355-g002.jpg
Figure 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/13e251897eb9/entropy-25-01355-g003.jpg
Figure 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/577ce901d7c0/entropy-25-01355-g004.jpg
Figure 5: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/05dcba18c46d/entropy-25-01355-g005.jpg
Figure 6: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/b8af4511a8cc/entropy-25-01355-g006.jpg
Figure 7: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/e857c48f29ed/entropy-25-01355-g007.jpg
Figure 8: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/f8c7b1c801de/entropy-25-01355-g008.jpg
Figure 9: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/f66e26b64e6b/entropy-25-01355-g009.jpg
Figure 10: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/bc7dc717488a/entropy-25-01355-g010.jpg
Figure 11: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/8b0d7480dff9/entropy-25-01355-g011.jpg
Figure 12: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/e499cb9a0d72/entropy-25-01355-g012.jpg
Figure A1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/70f9e48ad14c/entropy-25-01355-g0A1.jpg
Figure A2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/27e8b949979e/entropy-25-01355-g0A2.jpg
Figure A3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/de7256339846/entropy-25-01355-g0A3.jpg
Figure A4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7115/10528077/e514263d19e2/entropy-25-01355-g0A4.jpg

