Mutual information against correlations in binary communication channels.

Author information

Pregowska Agnieszka, Szczepanski Janusz, Wajnryb Eligiusz

Affiliation

Institute of Fundamental Technological Research, Polish Academy of Sciences, Pawinskiego 5B, Warsaw, PL.

Publication information

BMC Neurosci. 2015 May 19;16:32. doi: 10.1186/s12868-015-0168-0.

Abstract

BACKGROUND

Explaining why brain processing is so fast remains an open problem (van Hemmen JL, Sejnowski T., 2004). Analyses of neural transmission (Shannon CE, Weaver W., 1963) therefore focus largely on the search for effective encoding and decoding schemes. According to Shannon's fundamental theorem, mutual information plays a crucial role in characterizing the efficiency of communication channels. This efficiency is determined by the channel capacity, which is the maximal mutual information between input and output signals. On the other hand, intuitively speaking, the more correlated the input and output signals are, the more efficient the transmission should be. A natural question therefore arises about the relation between mutual information and correlation. We analyze the relation between these quantities using the binary representation of signals, which is the most common approach in studies of neuronal processes in the brain.
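The notion of capacity as maximal mutual information can be made concrete for the simplest binary channel. A minimal sketch, not taken from the paper: it assumes a binary symmetric channel with an illustrative crossover probability `eps`, for which the maximum of I(X;Y) over input distributions is attained at a uniform input.

```python
import math

def h2(p):
    # Binary entropy in bits; H2(0) = H2(1) = 0 by convention.
    if p == 0.0 or p == 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def bsc_capacity(eps):
    # Capacity of a binary symmetric channel with crossover probability eps:
    # the maximal mutual information between input and output, C = 1 - H2(eps),
    # achieved by a uniform input distribution.
    return 1.0 - h2(eps)
```

A noiseless channel (`eps = 0`) gives capacity 1 bit per use; a channel that flips each bit with probability 1/2 conveys nothing, giving capacity 0.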

RESULTS

We present binary communication channels for which mutual information and correlation coefficients behave differently, both quantitatively and qualitatively. Despite this difference in behavior, we show that noncorrelation of binary signals implies their independence, in contrast to the case of general types of signals.
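The contrast between the two quantities, and the binary-case link between noncorrelation and independence, can be checked numerically. A minimal sketch, assuming a 2x2 joint distribution `p[x][y]` as the channel description; the example distributions below are illustrative and not taken from the paper.

```python
import math

def mutual_information(p):
    # Mutual information in bits of binary X, Y with joint distribution p[x][y].
    px = [p[0][0] + p[0][1], p[1][0] + p[1][1]]  # marginal of X
    py = [p[0][0] + p[1][0], p[0][1] + p[1][1]]  # marginal of Y
    mi = 0.0
    for x in (0, 1):
        for y in (0, 1):
            if p[x][y] > 0.0:
                mi += p[x][y] * math.log2(p[x][y] / (px[x] * py[y]))
    return mi

def correlation(p):
    # Pearson correlation coefficient of binary X, Y under joint p[x][y].
    px1 = p[1][0] + p[1][1]          # P(X = 1)
    py1 = p[0][1] + p[1][1]          # P(Y = 1)
    cov = p[1][1] - px1 * py1        # E[XY] - E[X]E[Y]
    return cov / math.sqrt(px1 * (1.0 - px1) * py1 * (1.0 - py1))

# Product joint p(x, y) = p(x) p(y): zero correlation and zero mutual
# information, illustrating that for binary signals noncorrelation
# coincides with independence.
indep = [[0.3 * 0.6, 0.3 * 0.4], [0.7 * 0.6, 0.7 * 0.4]]

# A correlated channel: both quantities are strictly positive.
corr = [[0.4, 0.1], [0.1, 0.4]]
```

For non-binary signals this equivalence fails (zero correlation does not imply independence), which is why the binary case singled out in the results is special.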

CONCLUSIONS

Our research shows that mutual information cannot be replaced by sheer correlations. Our results indicate that neuronal encoding has a more complicated nature, which cannot be captured by straightforward correlations between input and output signals, since mutual information takes into account the structure and patterns of the signals.

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5f67/4445332/139bf6dc6b86/12868_2015_168_Fig1_HTML.jpg
