


Bottleneck Problems: An Information and Estimation-Theoretic View.

Authors

Shahab Asoodeh, Flavio P. Calmon

Affiliation

School of Engineering and Applied Science, Harvard University, Cambridge, MA 02138, USA.

Publication

Entropy (Basel). 2020 Nov 20;22(11):1325. doi: 10.3390/e22111325.

DOI: 10.3390/e22111325
PMID: 33287090
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7712227/
Abstract

Information bottleneck (IB) and privacy funnel (PF) are two closely related optimization problems which have found applications in machine learning, design of privacy algorithms, capacity problems (e.g., Mrs. Gerber's Lemma), and strong data processing inequalities, among others. In this work, we first investigate the functional properties of IB and PF through a unified theoretical framework. We then connect them to three information-theoretic coding problems, namely hypothesis testing against independence, noisy source coding, and dependence dilution. Leveraging these connections, we prove a new cardinality bound on the auxiliary variable in IB, making its computation more tractable for discrete random variables. In the second part, we introduce a general family of optimization problems, termed "bottleneck problems", by replacing mutual information in IB and PF with other notions of mutual information, namely f-information and Arimoto's mutual information. We then argue that, unlike IB and PF, these problems lead to easily interpretable guarantees in a variety of inference tasks with statistical constraints on accuracy and privacy. While the underlying optimization problems are non-convex, we develop a technique to evaluate bottleneck problems in closed form by equivalently expressing them in terms of lower convex or upper concave envelope of certain functions. By applying this technique to a binary case, we derive closed form expressions for several bottleneck problems.
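To make the IB objective from the abstract concrete, the sketch below evaluates the standard IB Lagrangian I(X;T) - beta * I(Y;T) for discrete random variables, given a joint pmf p(x,y) and a fixed stochastic encoder p(t|x) under the Markov chain Y - X - T. This is an illustrative aid, not the paper's method (which concerns functional properties, cardinality bounds, and convex-envelope evaluation); the function names `mutual_information`, `ib_lagrangian`, and the example distribution are all hypothetical.

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in nats for a joint pmf given as a 2-D array p(x, y)."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x), column vector
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y), row vector
    mask = pxy > 0                        # skip zero-probability cells
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

def ib_lagrangian(pxy, pt_given_x, beta):
    """Evaluate I(X;T) - beta * I(Y;T) for a fixed stochastic encoder
    p(t|x), under the Markov chain Y - X - T assumed in the IB setup."""
    px = pxy.sum(axis=1)                  # p(x)
    pxt = px[:, None] * pt_given_x        # p(x,t) = p(x) p(t|x)
    py_given_x = pxy / px[:, None]        # p(y|x)
    pyt = py_given_x.T @ pxt              # p(y,t) = sum_x p(y|x) p(x,t)
    return mutual_information(pxt) - beta * mutual_information(pyt)

# A doubly symmetric binary source with the identity encoder T = X:
# I(X;T) = H(X) = ln 2 and I(Y;T) = I(X;Y), so at beta = 1 the
# Lagrangian is ln 2 - I(X;Y).
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(ib_lagrangian(pxy, np.eye(2), beta=1.0))
```

The IB and PF optimizations then minimize or maximize such objectives over all encoders p(t|x); the paper's closed-form binary results correspond to taking convex/concave envelopes instead of searching over encoders numerically.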


Similar Articles

1. Bottleneck Problems: An Information and Estimation-Theoretic View.
   Entropy (Basel). 2020 Nov 20;22(11):1325. doi: 10.3390/e22111325.
2. Recognizing Predictive Substructures With Subgraph Information Bottleneck.
   IEEE Trans Pattern Anal Mach Intell. 2024 Mar;46(3):1650-1663. doi: 10.1109/TPAMI.2021.3112205. Epub 2024 Feb 6.
3. Exact and Soft Successive Refinement of the Information Bottleneck.
   Entropy (Basel). 2023 Sep 19;25(9):1355. doi: 10.3390/e25091355.
4. On the Information Bottleneck Problems: Models, Connections, Applications and Information Theoretic Views.
   Entropy (Basel). 2020 Jan 27;22(2):151. doi: 10.3390/e22020151.
5. The Convex Information Bottleneck Lagrangian.
   Entropy (Basel). 2020 Jan 14;22(1):98. doi: 10.3390/e22010098.
6. On the Difference between the Information Bottleneck and the Deep Information Bottleneck.
   Entropy (Basel). 2020 Jan 22;22(2):131. doi: 10.3390/e22020131.
7. The Deterministic Information Bottleneck.
   Neural Comput. 2017 Jun;29(6):1611-1630. doi: 10.1162/NECO_a_00961. Epub 2017 Apr 14.
8. Information-theoretic analysis of Hierarchical Temporal Memory-Spatial Pooler algorithm with a new upper bound for the standard information bottleneck method.
   Front Comput Neurosci. 2023 Jun 7;17:1140782. doi: 10.3389/fncom.2023.1140782. eCollection 2023.
9. Learning Representations for Neural Network-Based Classification Using the Information Bottleneck Principle.
   IEEE Trans Pattern Anal Mach Intell. 2020 Sep;42(9):2225-2239. doi: 10.1109/TPAMI.2019.2909031. Epub 2019 Apr 2.
10. Large-Margin Multi-View Information Bottleneck.
   IEEE Trans Pattern Anal Mach Intell. 2014 Aug;36(8):1559-72. doi: 10.1109/TPAMI.2013.2296528.

Cited By

1. Lossless Transformations and Excess Risk Bounds in Statistical Inference.
   Entropy (Basel). 2023 Sep 28;25(10):1394. doi: 10.3390/e25101394.
2. Exact and Soft Successive Refinement of the Information Bottleneck.
   Entropy (Basel). 2023 Sep 19;25(9):1355. doi: 10.3390/e25091355.

References

1. On the Information Bottleneck Problems: Models, Connections, Applications and Information Theoretic Views.
   Entropy (Basel). 2020 Jan 27;22(2):151. doi: 10.3390/e22020151.
2. Understanding Convolutional Neural Networks With Information Theory: An Initial Exploration.
   IEEE Trans Neural Netw Learn Syst. 2021 Jan;32(1):435-442. doi: 10.1109/TNNLS.2020.2968509. Epub 2021 Jan 4.
3. Distributed Variational Representation Learning.
   IEEE Trans Pattern Anal Mach Intell. 2021 Jan;43(1):120-138. doi: 10.1109/TPAMI.2019.2928806. Epub 2020 Dec 4.
4. Learning Representations for Neural Network-Based Classification Using the Information Bottleneck Principle.
   IEEE Trans Pattern Anal Mach Intell. 2020 Sep;42(9):2225-2239. doi: 10.1109/TPAMI.2019.2909031. Epub 2019 Apr 2.
5. The Information Bottleneck and Geometric Clustering.
   Neural Comput. 2019 Mar;31(3):596-612. doi: 10.1162/neco_a_01136. Epub 2018 Oct 12.
6. The Deterministic Information Bottleneck.
   Neural Comput. 2017 Jun;29(6):1611-1630. doi: 10.1162/NECO_a_00961. Epub 2017 Apr 14.
7. How many clusters? An information-theoretic perspective.
   Neural Comput. 2004 Dec;16(12):2483-506. doi: 10.1162/0899766042321751.