Re-encoding of associations by recurrent plasticity increases memory capacity.

Author information

Medina Daniel, Leibold Christian

Affiliations

Department Biologie II, Ludwig-Maximilians-Universität München, Munich, Germany; Bernstein Center for Computational Neuroscience Munich, Munich, Germany.

Publication information

Front Synaptic Neurosci. 2014 Jun 10;6:13. doi: 10.3389/fnsyn.2014.00013. eCollection 2014.

Abstract

Recurrent networks have been proposed as a model of associative memory. In such models, memory items are stored in the strength of connections between neurons. These modifiable connections or synapses constitute a shared resource among all stored memories, limiting the capacity of the network. Synaptic plasticity at different time scales can play an important role in optimizing the representation of associative memories, by keeping them sparse, uncorrelated and non-redundant. Here, we use a model of sequence memory to illustrate how plasticity allows a recurrent network to self-optimize by gradually re-encoding the representation of its memory items. A learning rule is used to sparsify large patterns, i.e., patterns with many active units. As a result, pattern sizes become more homogeneous, which increases the network's dynamical stability during sequence recall and allows more patterns to be stored. Last, we show that the learning rule allows for online learning in that it keeps the network in a robust dynamical steady state while storing new memories and overwriting old ones.
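The mechanism described in the abstract can be illustrated with a toy model: a binary recurrent network stores pattern-to-pattern transitions with Willshaw-style (clipped Hebbian) synapses, and overly large patterns are re-encoded down to a target size before storage so that a single recall threshold works uniformly across the sequence. This is a minimal sketch under stated assumptions, not the authors' actual model: the network size, threshold, and the random-pruning rule standing in for plasticity-based sparsification are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 400          # number of binary neurons (illustrative)
TARGET = 20      # target pattern size after re-encoding
THETA = 10       # recall threshold (half the pattern size)

def sparsify(pattern, target=TARGET):
    """Re-encode an overly large pattern by keeping only `target`
    randomly chosen active units. This random pruning is a stand-in
    for the paper's plasticity-based sparsification."""
    active = np.flatnonzero(pattern)
    if active.size <= target:
        return pattern
    keep = rng.choice(active, size=target, replace=False)
    out = np.zeros_like(pattern)
    out[keep] = 1
    return out

def store_sequence(W, patterns):
    """Willshaw-style binary Hebbian storage: set the synapse from
    every unit of pattern t to every unit of pattern t+1."""
    for pre, post in zip(patterns[:-1], patterns[1:]):
        W |= np.outer(post, pre).astype(bool)
    return W

def recall(W, cue, steps):
    """Deterministic threshold recall of a stored sequence."""
    x = cue.copy()
    trajectory = [x]
    for _ in range(steps):
        x = (W.astype(int) @ x >= THETA).astype(int)
        trajectory.append(x)
    return trajectory

# A sequence of random patterns with heterogeneous sizes;
# re-encoding homogenizes them before storage.
sizes = [20, 60, 20, 45, 20]
patterns = []
for s in sizes:
    p = np.zeros(N, dtype=int)
    p[rng.choice(N, size=s, replace=False)] = 1
    patterns.append(sparsify(p))

W = np.zeros((N, N), dtype=bool)
W = store_sequence(W, patterns)

recalled = recall(W, patterns[0], steps=len(patterns) - 1)
print([int((r == p).all()) for r, p in zip(recalled, patterns)])
```

Because every stored pattern has exactly `TARGET` active units after re-encoding, each correct successor neuron receives the full 20 inputs while cross-talk stays near the expected overlap of random sparse patterns, so one fixed threshold suffices; with heterogeneous pattern sizes, no single threshold would separate signal from cross-talk at every step, which is the dynamical-stability point the abstract makes.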

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8ab8/4051198/46ef5359d988/fnsyn-06-00013-g0001.jpg
