Siri Benoît, Berry Hugues, Cessac Bruno, Delord Bruno, Quoy Mathias
Team Alchemy, INRIA, Parc Club Orsay Université, Orsay Cedex, France.
Neural Comput. 2008 Dec;20(12):2937-66. doi: 10.1162/neco.2008.05-07-530.
We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule that includes passive forgetting and distinct timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, which involve a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
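As a rough illustration of the setting described in the abstract, the following minimal Python sketch simulates a random recurrent network with sigmoid discrete-time dynamics, applies a generic Hebbian weight update with a passive-forgetting factor on a slower timescale than the activity, and estimates the largest Lyapunov exponent from products of Jacobians of the activity map. All parameter values, the specific sigmoid transfer function, the outer-product form of the Hebbian term, and the timescale ratio tau_learn are illustrative assumptions, not the paper's exact equations.

    import numpy as np

    # Minimal sketch (not the paper's exact model): random recurrent network,
    # sigmoid activity dynamics, Hebbian update with passive forgetting, and a
    # Jacobian-based estimate of the largest Lyapunov exponent.

    rng = np.random.default_rng(0)
    N = 100            # number of neurons (assumed)
    g = 5.0            # synaptic gain of the random initial weights (assumed)
    lambda_ = 0.99     # passive-forgetting factor (assumed)
    alpha = 0.01       # Hebbian learning rate (assumed)
    tau_learn = 10     # activity steps per learning step, i.e. slower learning timescale (assumed)
    T_steps = 2000     # total number of activity steps (assumed)

    W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))   # random initial synaptic weights
    x = rng.uniform(0.0, 1.0, size=N)                  # initial neuronal activities

    def f(u):
        return 1.0 / (1.0 + np.exp(-u))                # sigmoid transfer function

    lyap_sum = 0.0
    v = rng.normal(size=N)
    v /= np.linalg.norm(v)                             # tangent vector for Lyapunov estimation

    for t in range(T_steps):
        u = W @ x
        x_new = f(u)

        # Jacobian of the activity map x -> f(W x): diag(f'(u)) @ W,
        # with f'(u) = f(u) * (1 - f(u)) for the sigmoid above.
        J = (x_new * (1.0 - x_new))[:, None] * W
        v = J @ v
        norm_v = np.linalg.norm(v)
        lyap_sum += np.log(norm_v)                     # accumulate local expansion rates
        v /= norm_v

        # Generic Hebbian learning with passive forgetting, applied on the slow timescale.
        if (t + 1) % tau_learn == 0:
            W = lambda_ * W + (alpha / N) * np.outer(x_new, x)

        x = x_new

    largest_lyapunov = lyap_sum / T_steps
    print(f"estimated largest Lyapunov exponent: {largest_lyapunov:.4f}")

In this kind of simulation, one would track the estimated largest Lyapunov exponent as learning proceeds; the abstract's claim is that the repeated Hebbian updates drive it downward (from chaos toward a steady state) and that sensitivity to a learned pattern peaks when it is close to 0.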