A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks.

Author Information

Siri Benoît, Berry Hugues, Cessac Bruno, Delord Bruno, Quoy Mathias

Affiliations

Team Alchemy, INRIA, Parc Club Orsay Université, Orsay Cedex, France.

Publication Information

Neural Comput. 2008 Dec;20(12):2937-66. doi: 10.1162/neco.2008.05-07-530.

Abstract

We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, using a generic Hebbian learning rule that includes passive forgetting and distinct timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, which involve a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
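The model class described here can be made concrete with a short numerical sketch. The Python/NumPy code below is an illustrative reconstruction, not the authors' exact formulation: it assumes a gain-g sigmoid transfer function, a Gaussian random initial weight matrix, and a Hebbian update of the form W ← λW + (α/N)·mmᵀ on centered rates m, where λ < 1 implements passive forgetting and the inner loop of T network iterations per weight update stands in for the separation between the neuronal and learning timescales. All parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# --- illustrative parameters (not taken from the paper) ---
N = 100        # number of neurons
g = 4.0        # sigmoid gain; a large gain favors chaotic dynamics
alpha = 0.05   # Hebbian learning rate
lam = 0.98     # passive-forgetting factor (lam = 1 would mean no forgetting)
T = 100        # neuron updates per learning step (timescale separation)
epochs = 100   # number of learning steps

def f(u):
    """Sigmoid transfer function with gain g."""
    return 1.0 / (1.0 + np.exp(-g * u))

# Random recurrent network: Gaussian weights scaled by 1/sqrt(N)
W = rng.normal(0.0, 1.0, size=(N, N)) / np.sqrt(N)
x = rng.uniform(0.0, 1.0, size=N)          # initial firing rates in [0, 1]
pattern = rng.uniform(0.0, 0.1, size=N)    # static input pattern to be learned

for epoch in range(epochs):
    # Fast timescale: iterate the neuronal dynamics with the weights frozen
    for _ in range(T):
        x = f(W @ x + pattern)
    # Slow timescale: one Hebbian update with passive forgetting.
    # Centering the rates makes the outer product a covariance-like term.
    m = x - x.mean()
    W = lam * W + (alpha / N) * np.outer(m, m)
```

With λ < 1, old synaptic structure decays geometrically while recent activity correlations are imprinted, which is consistent with the reported drive from chaos toward a steady state as learning proceeds.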

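The abstract's connection between Jacobian matrices and the largest Lyapunov exponent can be sketched in the same setting. For the map x ↦ f(Wx + u), the Jacobian at x is diag(f′(u))·W, and the largest Lyapunov exponent is the long-run average log growth rate of a tangent vector propagated through successive Jacobians. The hypothetical routine below reuses f, g, W, x, pattern, and rng from the previous sketch; a result near 0 corresponds to the regime in which the abstract reports maximal sensitivity to a learned pattern.

```python
def largest_lyapunov(W, x0, pattern, n_transient=500, n_steps=5000):
    """Estimate the largest Lyapunov exponent of x -> f(W x + pattern)
    by propagating a unit tangent vector through the Jacobians
    J(x) = diag(f'(u)) @ W and averaging the log of its growth rate."""
    x = x0.copy()
    v = rng.normal(size=x.size)
    v /= np.linalg.norm(v)
    log_growth = 0.0
    for t in range(n_transient + n_steps):
        y = f(W @ x + pattern)
        # For the gain-g sigmoid, f'(u) = g * f(u) * (1 - f(u)),
        # so diag(f'(u)) @ W scales each row of W by the local slope.
        J = (g * y * (1.0 - y))[:, None] * W
        v = J @ v
        norm = np.linalg.norm(v)
        v /= norm
        x = y
        if t >= n_transient:          # discard the transient
            log_growth += np.log(norm)
    return log_growth / n_steps

# Negative values indicate convergence to a steady state, positive values
# chaos; the abstract reports maximal pattern sensitivity near zero.
print(largest_lyapunov(W, x, pattern))
```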
