A Quick and Easy Way to Estimate Entropy and Mutual Information for Neuroscience.

Author Information

Mickael Zbili, Sylvain Rama

Affiliations

Lyon Neuroscience Research Center (CRNL), Inserm U1028, CNRS UMR 5292, Université Claude Bernard Lyon1, Bron, France.

Laboratory of Synaptic Imaging, Department of Clinical and Experimental Epilepsy, UCL Queen Square Institute of Neurology, University College London, London, United Kingdom.

Publication Information

Front Neuroinform. 2021 Jun 15;15:596443. doi: 10.3389/fninf.2021.596443. eCollection 2021.

Abstract

Calculations of the entropy of a signal, or of the mutual information between two variables, are valuable analytical tools in the field of neuroscience. They can be applied to all types of data, capture non-linear interactions and are model independent. Yet the limited size and number of recordings one can collect in a series of experiments make their calculation highly prone to sampling bias. Mathematical methods to overcome this so-called "sampling disaster" exist, but require significant expertise, time and computational cost. As such, there is a need for a simple, unbiased and computationally efficient tool for estimating the level of entropy and mutual information. In this article, we propose that the application of entropy-encoding compression algorithms widely used in text and image compression fulfills these requirements. By simply saving the signal in PNG picture format and measuring the size of the file on the hard drive, we can estimate entropy changes across different conditions. Furthermore, with some simple modifications of the PNG file, we can also estimate the evolution of mutual information between a stimulus and the observed responses across different conditions. We first demonstrate the applicability of this method using white-noise-like signals. Then, while this method can be used in all kinds of experimental conditions, we provide examples of its application in patch-clamp recordings, detection of place cells and histological data. Although this method does not give an absolute value of entropy or mutual information, it is mathematically correct, and its simplicity and broad use make it a powerful tool for their estimation through experiments.
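The core idea can be sketched in a few lines of Python. This is a minimal illustration, not the authors' exact pipeline: instead of writing an actual PNG file and reading its size on disk, it uses Python's standard `zlib` module (the same DEFLATE entropy coder used inside PNG) as a stand-in, and it approximates mutual information with the common compression identity C(x) + C(y) - C(x‖y), where the abstract only says the authors achieve this through "simple modifications of the PNG file". All names below are illustrative.

```python
import random
import zlib


def quantize(samples, levels=256):
    """Map a real-valued signal onto discrete byte values 0..levels-1."""
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1.0  # avoid division by zero for constant signals
    return bytes(int((s - lo) / span * (levels - 1)) for s in samples)


def compressed_size(data):
    """Bytes needed by a DEFLATE entropy coder: a proxy for entropy,
    playing the role of the PNG file size on disk."""
    return len(zlib.compress(data, 9))


def mi_proxy(x, y):
    """Compression-based stand-in for mutual information:
    C(x) + C(y) - C(x concatenated with y). Shared structure makes the
    joint stream compress better than the two streams separately."""
    return compressed_size(x) + compressed_size(y) - compressed_size(x + y)


random.seed(0)
n = 10_000
noise = quantize([random.random() for _ in range(n)])  # high-entropy signal
flat = quantize([0.5] * n)                             # low-entropy signal

# A white-noise-like signal needs far more bytes than a constant one.
print(compressed_size(noise), compressed_size(flat))

# A copy of the signal shares all of its information; an independently
# drawn signal shares (almost) none, so its proxy stays near zero.
indep = quantize([random.random() for _ in range(n)])
print(mi_proxy(noise, noise), mi_proxy(noise, indep))
```

As in the paper, only relative comparisons across conditions are meaningful: the compressed sizes track entropy changes but do not yield absolute entropy values in bits.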

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f9b8/8239197/b629723d2453/fninf-15-596443-g001.jpg
