
A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks.

Authors

Siri Benoît, Berry Hugues, Cessac Bruno, Delord Bruno, Quoy Mathias

Affiliation

Team Alchemy, INRIA, Parc Club Orsay Université, Orsay Cedex, France.

Publication

Neural Comput. 2008 Dec;20(12):2937-66. doi: 10.1162/neco.2008.05-07-530.

DOI: 10.1162/neco.2008.05-07-530
PMID: 18624656
Abstract

We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule including passive forgetting and different timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
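The setting the abstract describes can be sketched numerically: iterate a discrete-time random recurrent network, apply a Hebbian weight update with passive forgetting, and estimate the largest Lyapunov exponent from the product of step-by-step Jacobians. The sketch below is illustrative only; the network size, gain, learning rate, forgetting factor, and the specific outer-product form of the Hebbian rule are assumptions, not the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                     # number of neurons (assumed)
g = 2.0                     # coupling gain; large g favors chaotic dynamics
lam = 0.99                  # passive-forgetting factor (lambda < 1)
alpha = 0.01                # Hebbian learning rate (assumed)

W = g * rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # random recurrent weights
x = rng.uniform(-1.0, 1.0, N)                      # neuron states
v = rng.normal(size=N)                             # tangent vector
v /= np.linalg.norm(v)

log_growth = 0.0
T = 2000
for t in range(T):
    x_new = np.tanh(W @ x)            # discrete-time network dynamics
    # Jacobian of x -> tanh(W x) at this step is diag(1 - x_new**2) @ W;
    # propagate the tangent vector and accumulate its log growth rate.
    v = (1.0 - x_new**2) * (W @ v)
    norm = np.linalg.norm(v)
    log_growth += np.log(norm)
    v /= norm
    # Hebbian update with passive forgetting (one simple instance of the
    # generic rule: decay toward zero plus an activity outer product).
    W = lam * W + (alpha / N) * np.outer(x_new, x)
    x = x_new

lyap = log_growth / T                 # largest Lyapunov exponent estimate
print(f"largest Lyapunov exponent estimate: {lyap:.3f}")
```

Under this rule the forgetting term contracts the weights while the Hebbian term reinforces correlated activity, so tracking the Lyapunov estimate over learning is one way to observe the chaos-to-steady-state transition the abstract refers to.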

Similar Articles

1. A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks.
   Neural Comput. 2008 Dec;20(12):2937-66. doi: 10.1162/neco.2008.05-07-530.
2. Effects of Hebbian learning on the dynamics and structure of random networks with inhibitory and excitatory neurons.
   J Physiol Paris. 2007 Jan-May;101(1-3):136-48. doi: 10.1016/j.jphysparis.2007.10.003. Epub 2007 Oct 16.
3. The road to chaos by time-asymmetric Hebbian learning in recurrent neural networks.
   Neural Comput. 2007 Jan;19(1):80-110. doi: 10.1162/neco.2007.19.1.80.
4. A learning rule for the emergence of stable dynamics and timing in recurrent networks.
   J Neurophysiol. 2005 Oct;94(4):2275-83. doi: 10.1152/jn.01250.2004.
5. [Dynamic paradigm in psychopathology: "chaos theory", from physics to psychiatry].
   Encephale. 2001 May-Jun;27(3):260-8.
6. Spontaneous dynamics of asymmetric random recurrent spiking neural networks.
   Neural Comput. 2006 Jan;18(1):60-79. doi: 10.1162/089976606774841567.
7. Bayesian spiking neurons II: learning.
   Neural Comput. 2008 Jan;20(1):118-45. doi: 10.1162/neco.2008.20.1.118.
8. Learning only when necessary: better memories of correlated patterns in networks with bounded synapses.
   Neural Comput. 2005 Oct;17(10):2106-38. doi: 10.1162/0899766054615644.
9. Hebbian spike-driven synaptic plasticity for learning patterns of mean firing rates.
   Biol Cybern. 2002 Dec;87(5-6):459-70. doi: 10.1007/s00422-002-0356-8.
10. Learning rule of homeostatic synaptic scaling: presynaptic dependent or not.
   Neural Comput. 2011 Dec;23(12):3145-61. doi: 10.1162/NECO_a_00210. Epub 2011 Sep 15.

Cited By

1. Antifragile control systems in neuronal processing: a sensorimotor perspective.
   Biol Cybern. 2025 Feb 15;119(2-3):7. doi: 10.1007/s00422-025-01003-7.
2. Criticality, Connectivity, and Neural Disorder: A Multifaceted Approach to Neural Computation.
   Front Comput Neurosci. 2021 Feb 10;15:611183. doi: 10.3389/fncom.2021.611183. eCollection 2021.
3. Self-Optimization in Continuous-Time Recurrent Neural Networks.
   Front Robot AI. 2018 Aug 21;5:96. doi: 10.3389/frobt.2018.00096. eCollection 2018.
4. Dynamic Organization of Hierarchical Memories.
   PLoS One. 2016 Sep 12;11(9):e0162640. doi: 10.1371/journal.pone.0162640. eCollection 2016.
5. Combined effects of LTP/LTD and synaptic scaling in formation of discrete and line attractors with persistent activity from non-trivial baseline.
   Cogn Neurodyn. 2012 Dec;6(6):499-524. doi: 10.1007/s11571-012-9211-3. Epub 2012 Jul 14.
6. Effects of cellular homeostatic intrinsic plasticity on dynamical and computational properties of biological recurrent neural networks.
   J Neurosci. 2013 Sep 18;33(38):15032-43. doi: 10.1523/JNEUROSCI.0870-13.2013.
7. A spike-timing pattern based neural network model for the study of memory dynamics.
   PLoS One. 2009 Jul 24;4(7):e6247. doi: 10.1371/journal.pone.0006247.
8. On dynamics of integrate-and-fire neural networks with conductance based synapses.
   Front Comput Neurosci. 2008 Jul 4;2:2. doi: 10.3389/neuro.10.002.2008. eCollection 2008.