Faculty of Life Sciences, University of Manchester, Manchester, UK.
Front Neuroinform. 2009 Feb 11;3:4. doi: 10.3389/neuro.11.004.2009. eCollection 2009.
Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way that is independent of any specific assumptions about which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from the limited sampling bias and from the calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources.
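To make the limited sampling problem mentioned above concrete, the sketch below shows a plug-in (maximum likelihood) mutual information estimator in Python with the standard Miller-Madow first-order bias correction. This is an illustrative assumption, not the authors' actual toolkit or algorithms: the function names, the choice of correction, and the toy stimulus/response data are all introduced here purely for exposition.

```python
# Minimal sketch of a plug-in mutual information estimator with the
# Miller-Madow bias correction, illustrating the limited sampling bias
# discussed in the abstract. All names and the toy data are assumptions
# for illustration; this is not the authors' implementation.
import numpy as np

def entropy_plugin(counts):
    """Plug-in (maximum likelihood) entropy in bits from a count vector."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]                      # treat 0 * log(0) as 0
    return -np.sum(p * np.log2(p))

def entropy_miller_madow(counts):
    """Plug-in entropy plus the Miller-Madow first-order correction:
    H_MM = H_plugin + (K - 1) / (2 N ln 2), where K is the number of
    occupied bins and N the number of samples."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    k = np.count_nonzero(counts)
    return entropy_plugin(counts) + (k - 1) / (2.0 * n * np.log(2))

def mutual_information(x, y, corrected=True):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for discrete integer-coded data."""
    x = np.asarray(x)
    y = np.asarray(y)
    joint = np.zeros((x.max() + 1, y.max() + 1))
    for xi, yi in zip(x, y):          # joint histogram of (x, y) pairs
        joint[xi, yi] += 1
    h = entropy_miller_madow if corrected else entropy_plugin
    return h(joint.sum(axis=1)) + h(joint.sum(axis=0)) - h(joint.ravel())

# Toy example: 4 stimulus classes and noisy simulated spike counts.
rng = np.random.default_rng(0)
stim = rng.integers(0, 4, size=200)
resp = np.clip(stim + rng.poisson(1, size=200), 0, 7)
print("I(stim; resp) ~ %.3f bits" % mutual_information(stim, resp))
```

Comparing the corrected and uncorrected estimates as the number of trials shrinks makes the upward bias of the plug-in estimator visible, which is the practical motivation for the bias correction techniques the article addresses.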