Benoît Siri, Mathias Quoy, Bruno Delord, Bruno Cessac, Hugues Berry
INRIA, Futurs Research Centre, Project-Team Alchemy, 4 rue J Monod, 91893, Orsay Cedex, France.
J Physiol Paris. 2007 Jan-May;101(1-3):136-48. doi: 10.1016/j.jphysparis.2007.10.003. Epub 2007 Oct 16.
The aim of the present paper is to study the effects of Hebbian learning in random recurrent neural networks with biological connectivity, i.e. sparse connections and separate populations of excitatory and inhibitory neurons. We furthermore consider that the neuron dynamics may occur on a (shorter) time scale than synaptic plasticity, and we allow for learning rules with passive forgetting. We show that applying such Hebbian learning leads to drastic changes in both the network dynamics and its structure. In particular, the learning rule contracts the norm of the weight matrix, yielding a rapid decay of the complexity and entropy of the dynamics. In other words, Hebbian learning rewires the network into a new synaptic structure that emerges from the correlations progressively building up between neurons. We also observe that, within this emerging structure, the strongest synapses organize into a small-world network. A second effect of the decay of the weight-matrix spectral radius is a rapid contraction of the spectral radius of the Jacobian matrix. This drives the system through the "edge of chaos", where sensitivity to the input pattern is maximal. This scenario is remarkably well predicted by theoretical arguments derived from dynamical systems and graph theory.
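The contraction mechanism summarized above can be illustrated numerically. The following is a minimal sketch, not the paper's actual model: it assumes a tanh rate dynamics on a sparse random weight matrix, a rank-one Hebbian term, and exponential passive forgetting; all parameter values (`eps`, `lam`, network size, epoch counts) are illustrative choices, and Dale's law (separate excitatory/inhibitory populations) is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100     # number of neurons
p = 0.1     # connection sparsity (illustrative)
eps = 0.001 # Hebbian learning rate (illustrative)
lam = 0.1   # passive-forgetting rate per learning epoch (illustrative)

# Sparse random weight matrix; entries scaled so the initial
# spectral radius is of order 1.
mask = rng.random((N, N)) < p
W = rng.normal(0.0, 1.0 / np.sqrt(p * N), (N, N)) * mask

def spectral_radius(M):
    """Largest eigenvalue modulus of M."""
    return float(np.max(np.abs(np.linalg.eigvals(M))))

x = rng.random(N)  # initial firing rates
radii = [spectral_radius(W)]

for epoch in range(50):
    # Fast neural dynamics: iterate the rate equation many steps
    # (time-scale separation between dynamics and plasticity).
    for _ in range(100):
        x = np.tanh(W @ x)
    # Slow Hebbian update with passive forgetting, restricted to
    # existing synapses: W <- (1 - lam) W + eps * post (x) pre.
    W = (1 - lam) * W + eps * np.outer(x, x) * mask
    radii.append(spectral_radius(W))

print(f"initial radius: {radii[0]:.3f}, final radius: {radii[-1]:.3f}")
```

Under these assumptions the forgetting term dominates, so the spectral radius of `W` decays across epochs, which is the norm contraction the abstract describes; as the radius crosses 1 the trivial fixed point of the tanh dynamics becomes stable, the toy analogue of being driven through the edge of chaos.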