Kennel Matthew B, Shlens Jonathon, Abarbanel Henry D I, Chichilnisky E J
Institute for Nonlinear Science, University of California, San Diego, La Jolla, CA 92093-0402, USA.
Neural Comput. 2005 Jul;17(7):1531-76. doi: 10.1162/0899766053723050.
The entropy rate quantifies the amount of uncertainty or disorder produced by any dynamical system. In a spiking neuron, this uncertainty translates into the amount of information potentially encoded, and is thus the subject of intense theoretical and experimental investigation. Estimating this quantity from observed experimental data is difficult and requires a judicious selection of probabilistic models, balancing two opposing biases. We use a model-weighting principle originally developed for lossless data compression, following the minimum description length principle. This weighting yields a direct estimator of the entropy rate which, compared with existing methods, exhibits significantly less bias and converges faster in simulation. With Monte Carlo techniques, we estimate a Bayesian confidence interval for the entropy rate. In related work, we apply these ideas to estimate the information rates between sensory stimuli and neural responses in experimental data (Shlens, Kennel, Abarbanel, & Chichilnisky, in preparation).
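To make the compression-based idea concrete, below is a minimal Python sketch of one such weighting scheme from lossless data compression: context-tree weighting (CTW) with the Krichevsky-Trofimov (KT) estimator, applied to a binned binary spike train. This is an illustration of the general principle under stated assumptions, not the authors' exact estimator; the function name, the `depth` parameter, and the demo at the bottom are all illustrative.

```python
import numpy as np

class CTWNode:
    """One context node: symbol counts, KT log-probability, weighted log-probability."""
    __slots__ = ("counts", "log_pe", "log_pw", "children")
    def __init__(self):
        self.counts = [0, 0]          # number of 0s and 1s seen in this context
        self.log_pe = 0.0             # log2 KT-estimator probability of data seen here
        self.log_pw = 0.0             # log2 weighted (mixture) probability
        self.children = [None, None]  # children indexed by the next-older context bit

def ctw_entropy_rate(bits, depth=8):
    """Estimate the entropy rate (bits/symbol) of a binary sequence via
    context-tree weighting: the per-symbol description length -log2 Pw / n."""
    assert len(bits) > depth, "need more symbols than the context depth"
    root = CTWNode()
    n = 0
    for t in range(depth, len(bits)):
        x = bits[t]
        # Walk down the tree along the most recent `depth` past bits.
        path = [root]
        node = root
        for d in range(1, depth + 1):
            b = bits[t - d]
            if node.children[b] is None:
                node.children[b] = CTWNode()
            node = node.children[b]
            path.append(node)
        # Update KT and weighted probabilities from the leaf back to the root.
        for d in range(depth, -1, -1):
            node = path[d]
            a0, a1 = node.counts
            p_kt = (node.counts[x] + 0.5) / (a0 + a1 + 1.0)  # KT predictive prob.
            node.log_pe += np.log2(p_kt)
            node.counts[x] += 1
            if d == depth:
                # Leaf: no deeper models to mix over.
                node.log_pw = node.log_pe
            else:
                # Internal: Pw = 0.5 * Pe + 0.5 * Pw(child0) * Pw(child1);
                # a missing child has seen no data, so its probability is 1.
                lw = sum(c.log_pw for c in node.children if c is not None)
                node.log_pw = np.logaddexp2(node.log_pe, lw) - 1.0
        n += 1
    return -root.log_pw / n  # description length per symbol, in bits

# Illustrative demo: on an i.i.d. Bernoulli(0.1) "spike train", the estimate
# should land close to the true entropy H(0.1) ~= 0.469 bits per bin.
rng = np.random.default_rng(0)
spikes = (rng.random(20000) < 0.1).astype(int)
print(ctw_entropy_rate(spikes, depth=6))
```

The mixture update is the crux: each node hedges between its own KT model and the finer-grained models of its children, which is one way to balance the two opposing biases the abstract describes (models too coarse underfit the temporal structure; models too fine overfit the finite sample).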