Huang Wentao, Zhang Kechen
Key Laboratory of Cognition and Intelligence and Information Science Academy of China Electronics Technology Group Corporation, Beijing 100086, China.
Department of Biomedical Engineering, Johns Hopkins University School of Medicine, Baltimore, MD 21205, USA.
Entropy (Basel). 2019 Mar 4;21(3):243. doi: 10.3390/e21030243.
Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information, but this approach is restricted to continuous variables because the calculation of Fisher information requires derivatives with respect to the encoded variables. In this paper, we consider information-theoretic bounds and approximations of the mutual information based on Kullback-Leibler divergence and Rényi divergence. We propose several information metrics to approximate Shannon mutual information in the context of neural population coding. While our asymptotic formulas all work for discrete variables, one of them has consistent performance and high accuracy regardless of whether the encoded variables are discrete or continuous. We performed numerical simulations and confirmed that our approximation formulas were highly accurate for estimating the mutual information between the stimuli and the responses of a large neural population. These approximation formulas may facilitate the application of information theory to many practical and theoretical problems.
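The Fisher-information approach mentioned in the abstract rests on a standard large-population asymptotic relation, restated here for context in generic notation (this is the well-known form of the approximation, not an equation quoted from this paper): for a K-dimensional continuous stimulus with prior p(\theta) and Fisher information matrix J(\theta) of the population response R,

    I(\Theta; R) \approx H(\Theta)
        + \frac{1}{2} \int p(\theta)\, \log \frac{\det J(\theta)}{(2\pi e)^{K}} \, d\theta .

Because J(\theta) is built from derivatives of \log p(r \mid \theta) with respect to \theta, this formula has no direct analogue when \theta takes discrete values, which is the gap the divergence-based bounds and approximations described above are meant to fill.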
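To see why direct calculation is difficult, note that the exact mutual information requires the response marginal p(r), a sum over all stimuli nested inside an expectation over all population response patterns, so in practice it is usually estimated by Monte Carlo sampling. A minimal sketch for a discrete stimulus encoded by independent Poisson neurons (the tuning curves and all parameters here are hypothetical illustrations, not taken from the paper):

    import numpy as np
    from scipy.special import gammaln, logsumexp

    rng = np.random.default_rng(0)

    n_neurons = 50
    stimuli = np.linspace(0.0, 1.0, 8)                # 8 discrete stimulus values
    prior = np.full(len(stimuli), 1.0 / len(stimuli)) # uniform stimulus prior

    # Hypothetical Gaussian tuning curves: mean rate of each neuron per stimulus.
    centers = np.linspace(0.0, 1.0, n_neurons)
    rates = 1.0 + 20.0 * np.exp(-(stimuli[:, None] - centers[None, :]) ** 2
                                / (2 * 0.1 ** 2))     # shape (n_stim, n_neurons)

    def log_lik(r, j):
        """log p(r | theta_j) for independent Poisson spike counts r."""
        lam = rates[j]
        return np.sum(r * np.log(lam) - lam - gammaln(r + 1.0))

    # I(Theta; R) = E_{theta, r}[ log p(r | theta) - log p(r) ],
    # estimated by sampling (theta, r) pairs from the joint distribution.
    n_samples = 20000
    mi_nats = 0.0
    for _ in range(n_samples):
        k = rng.choice(len(stimuli), p=prior)         # theta ~ p(theta)
        r = rng.poisson(rates[k])                     # r ~ p(r | theta)
        log_cond = log_lik(r, k)
        log_marg = logsumexp([np.log(prior[j]) + log_lik(r, j)
                              for j in range(len(stimuli))])
        mi_nats += (log_cond - log_marg) / n_samples

    print(f"Monte Carlo estimate of I(Theta; R): {mi_nats / np.log(2):.3f} bits")

Even this small example must enumerate all stimuli for every sampled response to form p(r); with continuous or high-dimensional stimuli that inner computation itself becomes intractable, which is what motivates closed-form approximations of the kind the paper proposes.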