Zuckerman Mind, Brain, Behavior Institute, Columbia University, New York, New York, United States of America.
The Abdus Salam International Centre for Theoretical Physics, Trieste, Italy.
PLoS Comput Biol. 2022 Dec 5;18(12):e1010590. doi: 10.1371/journal.pcbi.1010590. eCollection 2022 Dec.
Neural circuits exhibit complex activity patterns, both spontaneously and evoked by external stimuli. Information encoding and learning in neural circuits depend on how well time-varying stimuli can control spontaneous network activity. We show that in firing-rate networks in the balanced state, external control of recurrent dynamics, i.e., the suppression of internally-generated chaotic variability, strongly depends on correlations in the input. A distinctive feature of balanced networks is that, because common external input is dynamically canceled by recurrent feedback, it is far more difficult to suppress chaos with input that is common across neurons than with input that is independent for each neuron. To study this phenomenon, we develop a non-stationary dynamic mean-field theory for driven networks. The theory explains how the activity statistics and the largest Lyapunov exponent depend on the frequency and amplitude of the input, the recurrent coupling strength, and the network size, for both common and independent input. We further show that uncorrelated inputs facilitate learning in balanced networks.
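To make the common-versus-independent comparison concrete, the following is a minimal sketch, not the authors' code: a tanh firing-rate network whose coupling matrix has an inhibitory-dominated mean (a rough stand-in for the balanced architecture, since the mean recurrent feedback then dynamically cancels input shared across neurons), driven by a sinusoid that is either identical for all neurons or given an independent random phase per neuron. The largest Lyapunov exponent is estimated by tracking a nearby perturbed trajectory with repeated renormalization. All parameter values (N, g, J0, A, f) and helper names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Rate dynamics: dx/dt = -x + J @ tanh(x) + I(t).
# The -J0/sqrt(N) mean coupling provides the strong inhibitory feedback that
# cancels common input, loosely mimicking the balanced state in the abstract.
N, g, J0 = 400, 2.0, 1.0          # network size, random coupling strength, mean inhibition
J = (g * rng.standard_normal((N, N)) - J0) / np.sqrt(N)

A, f = 2.0, 0.2                   # drive amplitude and frequency (assumed values)
dt, T_trans, T = 0.05, 2000, 20000  # Euler step, transient steps, measurement steps

phases = rng.uniform(0.0, 2.0 * np.pi, N)  # only used for the independent drive

def drive(t, common):
    """Sinusoidal input: identical across neurons (common=True) or with an
    independent random phase per neuron (common=False)."""
    if common:
        return A * np.sin(2.0 * np.pi * f * t) * np.ones(N)
    return A * np.sin(2.0 * np.pi * f * t + phases)

def step(x, I):
    """One forward-Euler step of the rate dynamics."""
    return x + dt * (-x + J @ np.tanh(x) + I)

def lyapunov(common, eps=1e-8):
    """Largest Lyapunov exponent, Benettin-style: evolve a trajectory and a
    copy perturbed by eps, renormalizing the separation after every step."""
    x = rng.normal(0.0, 1.0, N)
    for k in range(T_trans):                  # discard the transient
        x = step(x, drive(k * dt, common))
    u = rng.normal(0.0, 1.0, N)
    y = x + eps * u / np.linalg.norm(u)       # perturbed copy at distance eps
    lam = 0.0
    for k in range(T_trans, T_trans + T):
        I = drive(k * dt, common)             # both trajectories get the same input
        x, y = step(x, I), step(y, I)
        d = np.linalg.norm(y - x)
        lam += np.log(d / eps)                # accumulate log expansion rate
        y = x + (eps / d) * (y - x)           # renormalize the perturbation
    return lam / (T * dt)

print("lambda_max, common input:     ", lyapunov(common=True))
print("lambda_max, independent input:", lyapunov(common=False))
```

A negative estimate of the largest Lyapunov exponent indicates that the drive has suppressed the intrinsic chaos; per the abstract, independent input should achieve this more readily (e.g., at lower amplitude) than common input, which the recurrent feedback cancels.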