
Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states.

Affiliation

Complexity Sciences Center and Physics Department, University of California at Davis, One Shields Avenue, Davis, California 95616, USA.

Publication information

Phys Rev E. 2017 Jun;95(6-1):060102. doi: 10.1103/PhysRevE.95.060102. Epub 2017 Jun 13.

Abstract

One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process' past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
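The abstract's central identity can be checked numerically on a finite alphabet: merging the values of X that induce the same conditional distribution p(Y|X=x) yields X's minimal sufficient statistic about Y, and doing this on both sides leaves the mutual information unchanged. The sketch below is illustrative only and is not from the paper; the function names `mutual_information` and `trim` are my own.

```python
# Minimal sketch of "information trimming" on a finite joint distribution.
# Assumption: exact sufficiency on a discrete alphabet, detected by comparing
# conditional rows p(Y|X=x) up to rounding.
import numpy as np

def mutual_information(p):
    """I(X;Y) in bits for a joint distribution p[x, y]."""
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    mask = p > 0
    return float((p[mask] * np.log2(p[mask] / (px @ py)[mask])).sum())

def trim(p):
    """Merge rows of p whose conditionals p(Y|X=x) coincide: the minimal
    sufficient statistic of X about Y on a finite alphabet."""
    cond = p / p.sum(axis=1, keepdims=True)
    groups = {}
    for i, row in enumerate(cond):
        groups.setdefault(tuple(np.round(row, 12)), []).append(i)
    return np.array([p[idx].sum(axis=0) for idx in groups.values()])

# Toy joint: x=0 and x=1 induce the same p(Y|X), so they merge.
p = np.array([[0.10, 0.10],
              [0.15, 0.15],
              [0.05, 0.45]])
i_xy = mutual_information(p)
p_trimmed = trim(trim(p).T).T   # trim the X side, then the Y side
i_trimmed = mutual_information(p_trimmed)
assert np.isclose(i_xy, i_trimmed)  # trimming preserves I(X;Y)
```

Here the 3x2 joint collapses to a 2x2 one, yet the mutual information is preserved exactly; this is the finite-alphabet analogue of replacing a process's past and future by their forward- and reverse-time causal states.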

