Rivkind Alexander, Barak Omri
Faculty of Medicine, Technion-Israel Institute of Technology, Haifa 32000, Israel.
Network Biology Research Laboratories, Technion-Israel Institute of Technology, Haifa 32000, Israel.
Phys Rev Lett. 2017 Jun 23;118(25):258101. doi: 10.1103/PhysRevLett.118.258101.
Learning a task induces connectivity changes in neural circuits, thereby changing their dynamics. To elucidate task-related neural dynamics, we study trained recurrent neural networks. We develop a mean field theory for reservoir computing networks trained to have multiple fixed point attractors. Our main result is that the dynamics of the network's output in the vicinity of attractors is governed by a low-order linear ordinary differential equation. The stability of the resulting equation can be assessed, predicting training success or failure. As a consequence, networks of rectified linear units and of sigmoidal nonlinearities are shown to have diametrically different properties when it comes to learning attractors. Furthermore, a characteristic time constant, which remains finite at the edge of chaos, offers an explanation of the network's output robustness in the presence of variability of the internal neural dynamics. Finally, the proposed theory predicts state-dependent frequency selectivity in the network response.
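As a rough illustration of the setting (not the paper's actual derivation), the sketch below builds a small random reservoir with output feedback and trains a linear readout, via ridge regression on output-clamped fixed points, to hold two fixed-point attractors at ±1. All specifics here are illustrative assumptions: the network size `N`, the gain `g` (chosen below the chaotic transition so the clamped dynamics converge), and the clamp-then-regress training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200          # reservoir size (illustrative choice)
g = 0.9          # recurrent gain; kept below 1 so the clamped map contracts
W = g * rng.standard_normal((N, N)) / np.sqrt(N)   # random recurrent weights
w_fb = rng.standard_normal(N)                      # feedback weights from the readout

def clamped_fixed_point(A, iters=2000):
    """Iterate x <- tanh(W x + w_fb * A) with the output clamped to A."""
    x = np.zeros(N)
    for _ in range(iters):
        x = np.tanh(W @ x + w_fb * A)
    return x

targets = [-1.0, 1.0]
X = np.stack([clamped_fixed_point(A) for A in targets])   # reservoir states, shape (2, N)
y = np.array(targets)

# Ridge regression for the readout: w_out @ x* = A at each clamped state.
lam = 1e-4
w_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)

# Closed-loop probe: perturb the network around one trained attractor and let
# the feedback loop z = w_out @ x drive the dynamics.  Whether z relaxes back
# to the target is exactly the stability question the mean field theory answers.
x = clamped_fixed_point(1.0) + 0.05 * rng.standard_normal(N)
for _ in range(1000):
    z = w_out @ x
    x = np.tanh(W @ x + w_fb * z)
print(w_out @ X[0], w_out @ X[1])
```

The readout fits the two clamped states essentially exactly (two equations, N unknowns); the interesting question, which this toy script only probes numerically, is whether the closed-loop dynamics around those states are stable, which the paper answers analytically via a low-order linear ODE for the output.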