


Slow stochastic Hebbian learning of classes of stimuli in a recurrent neural network.

Author information

Brunel N, Carusi F, Fusi S

Affiliation

Ecole Normale Supérieure, Paris, France.

Publication information

Network. 1998 Feb;9(1):123-52.

PMID: 9861982

Abstract

We study unsupervised Hebbian learning in a recurrent network in which synapses have a finite number of stable states. Stimuli received by the network are drawn at random at each presentation from a set of classes. Each class is defined as a cluster in stimulus space, centred on the class prototype. The presentation protocol is chosen to mimic the protocols of visual memory experiments in which a set of stimuli is presented repeatedly in a random way. The statistics of the input stream may be stationary, or changing. Each stimulus induces, in a stochastic way, transitions between stable synaptic states. Learning dynamics is studied analytically in the slow learning limit, in which a given stimulus has to be presented many times before it is memorized, i.e. before synaptic modifications enable a pattern of activity correlated with the stimulus to become an attractor of the recurrent network. We show that in this limit the synaptic matrix becomes more correlated with the class prototypes than with any of the instances of the class. We also show that the number of classes that can be learned increases sharply when the coding level decreases, and determine the speeds of learning and forgetting of classes in the case of changes in the statistics of the input stream.
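The setup described in the abstract can be sketched in a few lines of NumPy: synapses with two stable states, stimuli drawn as noisy versions of a class prototype, and small stochastic transition probabilities to put the dynamics in the slow learning limit. This is an illustrative toy, not the authors' exact protocol; the network size, coding level, flip rate, and equal potentiation/depression probabilities are assumptions chosen for the demo, and the paper's synapses need not be restricted to two states.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # neurons
f = 0.2   # coding level: fraction of active units in a pattern
q = 0.02  # transition probability per presentation (slow learning limit)
T = 3000  # number of stimulus presentations

# One class, defined by a binary prototype pattern at coding level f.
prototype = (rng.random(N) < f).astype(int)

def noisy_instance(proto, flip=0.1):
    """Draw a stimulus from the class cluster: flip each unit w.p. `flip`."""
    return np.where(rng.random(proto.size) < flip, 1 - proto, proto)

# Binary synapses: two stable states, random initial condition.
J = rng.integers(0, 2, size=(N, N))

for _ in range(T):
    s = noisy_instance(prototype)
    pre, post = s[np.newaxis, :], s[:, np.newaxis]
    # Stochastic Hebbian transitions: with small probability q, an
    # active-pre/active-post pair potentiates and an active-pre/silent-post
    # pair depresses; all other synapses keep their current stable state.
    J[(post == 1) & (pre == 1) & (rng.random((N, N)) < q)] = 1
    J[(post == 0) & (pre == 1) & (rng.random((N, N)) < q)] = 0

def overlap(pattern):
    """Correlation of the synaptic matrix with a pattern's Hebbian outer product."""
    h = np.outer(pattern, pattern)
    return np.corrcoef(J.ravel(), h.ravel())[0, 1]

corr_proto = overlap(prototype)
corr_inst = overlap(noisy_instance(prototype))
print(corr_proto, corr_inst)
```

Because each synapse averages over many independently noisy presentations, the learned matrix tracks the statistics of the class rather than any single stimulus, so in this sketch the correlation with the prototype comes out higher than with an individual instance, mirroring the paper's main qualitative result.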


Similar articles

1
Slow stochastic Hebbian learning of classes of stimuli in a recurrent neural network.
Network. 1998 Feb;9(1):123-52.
2
Learning viewpoint-invariant face representations from visual experience in an attractor network.
Network. 1998 Aug;9(3):399-417.
3
A mathematical analysis of the effects of Hebbian learning rules on the dynamics and structure of discrete-time random recurrent neural networks.
Neural Comput. 2008 Dec;20(12):2937-66. doi: 10.1162/neco.2008.05-07-530.
4
The road to chaos by time-asymmetric Hebbian learning in recurrent neural networks.
Neural Comput. 2007 Jan;19(1):80-110. doi: 10.1162/neco.2007.19.1.80.
5
Generalization and exclusive allocation of credit in unsupervised category learning.
Network. 1998 May;9(2):279-302.
6
Effects of Hebbian learning on the dynamics and structure of random networks with inhibitory and excitatory neurons.
J Physiol Paris. 2007 Jan-May;101(1-3):136-48. doi: 10.1016/j.jphysparis.2007.10.003. Epub 2007 Oct 16.
7
Comparison of computational models of familiarity discrimination in the perirhinal cortex.
Hippocampus. 2003;13(4):494-524. doi: 10.1002/hipo.10093.
8
Learning transform invariant object recognition in the visual system with multiple stimuli present during training.
Neural Netw. 2008 Sep;21(7):888-903. doi: 10.1016/j.neunet.2007.11.004. Epub 2008 Apr 8.
9
Learning in realistic networks of spiking neurons and spike-driven plastic synapses.
Eur J Neurosci. 2005 Jun;21(11):3143-60. doi: 10.1111/j.1460-9568.2005.04087.x.
10
Learning attractors in an asynchronous, stochastic electronic neural network.
Network. 1998 May;9(2):183-205. doi: 10.1088/0954-898x/9/2/003.

Cited by

1
Semantic integration by pattern priming: experiment and cortical network model.
Cogn Neurodyn. 2016 Dec;10(6):513-533. doi: 10.1007/s11571-016-9410-4. Epub 2016 Sep 17.
2
Dynamics of Competition between Subnetworks of Spiking Neuronal Networks in the Balanced State.
PLoS One. 2015 Sep 25;10(9):e0138947. doi: 10.1371/journal.pone.0138947. eCollection 2015.
3
A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks.
PLoS Comput Biol. 2015 Aug 20;11(8):e1004439. doi: 10.1371/journal.pcbi.1004439. eCollection 2015 Aug.
4
A high-capacity model for one shot association learning in the brain.
Front Comput Neurosci. 2014 Nov 7;8:140. doi: 10.3389/fncom.2014.00140. eCollection 2014.
5
Inter-synaptic learning of combination rules in a cortical network model.
Front Psychol. 2014 Aug 28;5:842. doi: 10.3389/fpsyg.2014.00842. eCollection 2014.
6
Memory capacity of networks with stochastic binary synapses.
PLoS Comput Biol. 2014 Aug 7;10(8):e1003727. doi: 10.1371/journal.pcbi.1003727. eCollection 2014 Aug.
7
Synaptic encoding of temporal contiguity.
Front Comput Neurosci. 2013 Apr 12;7:32. doi: 10.3389/fncom.2013.00032. eCollection 2013.
8
Soft-bound synaptic plasticity increases storage capacity.
PLoS Comput Biol. 2012;8(12):e1002836. doi: 10.1371/journal.pcbi.1002836. Epub 2012 Dec 20.
9
Long memory lifetimes require complex synapses and limited sparseness.
Front Comput Neurosci. 2007 Nov 30;1:7. doi: 10.3389/neuro.10.007.2007. eCollection 2007.
10
Universal memory mechanism for familiarity recognition and identification.
J Neurosci. 2008 Jan 2;28(1):239-48. doi: 10.1523/JNEUROSCI.4799-07.2008.