Slow stochastic Hebbian learning of classes of stimuli in a recurrent neural network.

Author information

Brunel N, Carusi F, Fusi S

Affiliation

Ecole Normale Supérieure, Paris, France.

Publication information

Network. 1998 Feb;9(1):123-52.

PMID: 9861982
Abstract

We study unsupervised Hebbian learning in a recurrent network in which synapses have a finite number of stable states. Stimuli received by the network are drawn at random at each presentation from a set of classes. Each class is defined as a cluster in stimulus space, centred on the class prototype. The presentation protocol is chosen to mimic the protocols of visual memory experiments in which a set of stimuli is presented repeatedly in a random way. The statistics of the input stream may be stationary, or changing. Each stimulus induces, in a stochastic way, transitions between stable synaptic states. Learning dynamics is studied analytically in the slow learning limit, in which a given stimulus has to be presented many times before it is memorized, i.e. before synaptic modifications enable a pattern of activity correlated with the stimulus to become an attractor of the recurrent network. We show that in this limit the synaptic matrix becomes more correlated with the class prototypes than with any of the instances of the class. We also show that the number of classes that can be learned increases sharply when the coding level decreases, and determine the speeds of learning and forgetting of classes in the case of changes in the statistics of the input stream.
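The learning mechanism described above can be illustrated with a small simulation: synapses with two stable states {0, 1}, a class prototype at coding level f, noisy instances drawn around it, and stochastic potentiation/depression on each presentation. The transition probabilities q_pot and q_dep, the cluster-noise parameter flip, and the covariance-template overlap measure below are illustrative assumptions, not the paper's exact definitions; the sketch only demonstrates the qualitative claim that, in the slow learning limit, the synaptic matrix becomes more correlated with the prototype than with any single instance.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200        # neurons
f = 0.10       # coding level: fraction of active units in a stimulus
q_pot = 0.01   # potentiation probability per presentation (slow learning: q << 1)
q_dep = 0.01 * f / (1 - f)  # depression probability (assumed LTP/LTD balance)
flip = 0.05    # probability that an instance unit is redrawn (cluster width)

# Class prototype: a random binary pattern at coding level f.
prototype = (rng.random(N) < f).astype(np.int8)

def draw_instance(proto):
    """Draw a stimulus from the class: each unit is redrawn at coding level f
    with probability `flip`, so instances cluster around the prototype."""
    redraw = rng.random(N) < flip
    fresh = (rng.random(N) < f).astype(np.int8)
    return np.where(redraw, fresh, proto)

# Synapses with two stable states {0, 1} (simplest "finite number of states").
J = np.zeros((N, N), dtype=np.int8)

for _ in range(5000):  # many presentations of stimuli from the same class
    xi = draw_instance(prototype).astype(bool)
    both_active = np.outer(xi, xi)           # pre- and postsynaptic units active
    mismatch = np.logical_xor.outer(xi, xi)  # exactly one of the pair active
    # Stochastic transitions between the stable synaptic states:
    J[both_active & (rng.random((N, N)) < q_pot)] = 1
    J[mismatch & (rng.random((N, N)) < q_dep)] = 0

def overlap(J, pattern):
    """Correlation of J with a Hebbian covariance template built from `pattern`
    (diagonal excluded); an illustrative similarity measure, not the paper's."""
    template = np.outer(pattern - f, pattern - f)
    off_diag = ~np.eye(N, dtype=bool)
    return np.corrcoef(J[off_diag].astype(float), template[off_diag])[0, 1]

print("overlap with prototype:    %.3f" % overlap(J, prototype))
print("overlap with one instance: %.3f" % overlap(J, draw_instance(prototype)))
```

Run repeatedly, the first overlap comes out consistently larger than the second: because each instance perturbs a different random subset of units, the slow stochastic updates average over the cluster and the synaptic matrix converges toward the prototype's statistics rather than any one stimulus.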

