Kevin S. Chen
Princeton Neuroscience Institute, Princeton University, Princeton, NJ 08544, USA.
Entropy (Basel). 2022 Apr 25;24(5):598. doi: 10.3390/e24050598.
The efficient coding hypothesis states that neural responses should maximize their information about the external input. Theoretical studies have focused on optimal responses of single neurons and on population codes in networks with weak pairwise interactions. However, more biologically realistic settings with asymmetric connectivity, and the encoding of dynamical stimuli, have not been well characterized. Here, we study the collective response of a kinetic Ising model that encodes dynamic input. Given a neural code that encodes dynamic input patterns, we apply a gradient-based method and a mean-field approximation to reconstruct the underlying networks. We measure network asymmetry, decoding performance, and entropy production in networks that generate an optimal population code. We analyze how stimulus correlation, time scale, and the reliability of the network affect the optimal encoding networks. Specifically, we find that the statistics of the dynamic input alter the network dynamics, identify stimulus-encoding strategies, and show that an optimal effective temperature exists in the asymmetric networks. We further discuss how this approach connects to the Bayesian framework and to continuous recurrent neural networks. Together, these results bridge concepts from nonequilibrium physics with analyses of dynamics and coding in networks.
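To make the setting concrete, the following sketch simulates a kinetic Ising model with asymmetric couplings driven by a temporally correlated stimulus, updated with parallel Glauber dynamics, and estimates the stochastic entropy production per step from the log-ratio of forward and reverse transition probabilities. All parameter values (network size, coupling scale, stimulus time constant, inverse temperature) are illustrative assumptions, not taken from the paper; the time-dependent field is held fixed within each transition when evaluating the reverse probability, which is a simplifying approximation.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, beta = 10, 1000, 1.0  # hypothetical network size, steps, inverse temperature

# Asymmetric couplings: J[i, j] != J[j, i] in general (no symmetry imposed).
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
np.fill_diagonal(J, 0.0)

# Dynamic input h(t): a discretized Ornstein-Uhlenbeck process with
# correlation time tau, standing in for a temporally correlated stimulus.
tau = 10.0
h = np.zeros((T, N))
for t in range(1, T):
    h[t] = h[t - 1] * (1.0 - 1.0 / tau) + rng.normal(0.0, 0.3, N)

def log_transition(s_from, s_to, h_t):
    """Log probability of a parallel Glauber update s_from -> s_to:
    P(s'|s) = prod_i exp(beta * s'_i * H_i) / (2 cosh(beta * H_i)),
    where H_i = sum_j J_ij s_j + h_i is the local field."""
    H = J @ s_from + h_t
    return np.sum(beta * s_to * H - np.log(2.0 * np.cosh(beta * H)))

# Parallel Glauber dynamics: P(s_i(t+1) = +1) = 1 / (1 + exp(-2 beta H_i)).
s = np.ones((T, N))
for t in range(T - 1):
    H = J @ s[t] + h[t]
    p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * H))
    s[t + 1] = np.where(rng.random(N) < p_up, 1.0, -1.0)

# Stochastic entropy production per step: ln P(s'|s) - ln P(s|s'),
# evaluated with the field of that step (approximation for dynamic h).
sigma = np.array([
    log_transition(s[t], s[t + 1], h[t]) - log_transition(s[t + 1], s[t], h[t])
    for t in range(T - 1)
])
print(s.shape, float(np.mean(sigma)))
```

For a symmetric coupling matrix and static field the mean entropy production would vanish at stationarity; with asymmetric couplings the dynamics are irreversible and the time-averaged entropy production is generically positive, which is one of the nonequilibrium quantities the abstract refers to.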