Albanna Badr F, Hillar Christopher, Sohl-Dickstein Jascha, DeWeese Michael R
Department of Natural Sciences, Fordham University, New York, NY 10023, USA.
Department of Physics, University of California, Berkeley, CA 94720, USA.
Entropy (Basel). 2017 Aug 21;19(8):427. doi: 10.3390/e19080427.
Maximum entropy models are increasingly being used to describe the collective activity of neural populations with measured mean neural activities and pairwise correlations, but the full space of probability distributions consistent with these constraints has not been explored. We provide upper and lower bounds on the entropy for the minimum entropy distribution over arbitrarily large collections of binary units with any fixed set of mean values and pairwise correlations. We also construct specific low-entropy distributions for several relevant cases. Surprisingly, the minimum entropy solution has entropy scaling logarithmically with system size for any set of first- and second-order statistics consistent with arbitrarily large systems. We further demonstrate that some sets of these low-order statistics can only be realized by small systems. Our results show that only a small amount of randomness is needed to mimic the low-order statistical properties of highly entropic distributions, and we discuss some applications for engineered and biological information transmission systems.
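The logarithmic scaling can be illustrated with a standard Hadamard-code construction (a sketch for intuition, not necessarily the paper's own construction): a uniform distribution over the 2n rows of an n x n Hadamard matrix and their negations has every unit mean equal to zero and every pairwise correlation equal to zero, matching the statistics of n independent fair coins, yet its entropy is only log2(2n) bits instead of n bits.

```python
import numpy as np

def hadamard(n):
    # Sylvester construction of an n x n Hadamard matrix (n a power of 2).
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

n = 8
H = hadamard(n)
patterns = np.vstack([H, -H])                  # 2n equiprobable +/-1 patterns
p = np.full(len(patterns), 1.0 / len(patterns))

means = patterns.T @ p                         # first-order statistics E[x_i]
corrs = (patterns * p[:, None]).T @ patterns   # second-order statistics E[x_i x_j]

# Means and off-diagonal correlations are all zero, exactly as for n
# independent fair coins, but the support has only 2n points, so the
# entropy is log2(2n) bits rather than n bits.
entropy = -np.sum(p * np.log2(p))
print(entropy)  # 4.0 bits for n = 8, versus 8 bits for the i.i.d. distribution
```

The column orthogonality of the Hadamard matrix is what forces every pairwise correlation to vanish; the same idea extends to any n that is a power of two, giving a family of distributions whose entropy grows like log(n) while matching the low-order statistics of a maximum entropy (n-bit) distribution.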