Marzen, Sarah
Physics of Living Systems Group, Department of Physics, Massachusetts Institute of Technology, Cambridge, MA 02139, USA.
Entropy (Basel). 2018 Aug 11;20(8):599. doi: 10.3390/e20080599.
Causal states are minimal sufficient statistics for predicting a stochastic process, their coding cost is called the statistical complexity, and the implied causal structure yields a sense of the process's "intrinsic computation". We discuss how statistical complexity changes with slight changes to the underlying model, in this case a biologically motivated dynamical model of a Monod-Wyman-Changeux molecule. Perturbations to the kinetic rates cause the statistical complexity to jump from finite to infinite. The same is not true for the excess entropy, the mutual information between past and future, or for the molecule's transfer function. We discuss the implications of this for the relationship between the intrinsic and functional computation of biological sensory systems.
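For reference, the quantities named in the abstract follow the standard computational-mechanics definitions; the notation below is the conventional one and is not spelled out in the abstract itself.

% Causal states: two pasts are equivalent iff they induce the same
% conditional distribution over futures.
\[
  \overleftarrow{x} \sim_\epsilon \overleftarrow{x}' \;\iff\;
  \Pr\!\left(\overrightarrow{X} \,\middle|\, \overleftarrow{X} = \overleftarrow{x}\right)
  = \Pr\!\left(\overrightarrow{X} \,\middle|\, \overleftarrow{X} = \overleftarrow{x}'\right)
\]
% Statistical complexity: the coding cost (Shannon entropy) of the
% causal-state random variable S.
\[
  C_\mu = H[\mathcal{S}]
\]
% Excess entropy: the mutual information shared between past and future,
% which never exceeds the statistical complexity.
\[
  \mathbf{E} = I\!\left[\overleftarrow{X}; \overrightarrow{X}\right] \le C_\mu
\]

The abstract's central contrast is between these two quantities: small perturbations to the model can drive \(C_\mu\) from finite to infinite while \(\mathbf{E}\) remains well behaved.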