

Multiassociative Memory: Recurrent Synapses Increase Storage Capacity.

Author Information

Gauy Marcelo Matheus, Meier Florian, Steger Angelika

Affiliations

Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich 8092, Switzerland

Department of Computer Science, Institute of Theoretical Computer Science, ETH Zurich, Zurich 8092, Switzerland, and Collegium Helveticum, Zurich 8090, Switzerland

Publication Info

Neural Comput. 2017 May;29(5):1375-1405. doi: 10.1162/NECO_a_00954. Epub 2017 Mar 23.

DOI: 10.1162/NECO_a_00954
PMID: 28333588
Abstract

The connection density of nearby neurons in the cortex has been observed to be around 0.1, whereas the longer-range connections are present with much sparser density (Kalisman, Silberberg, & Markram, 2005). We propose a memory association model that qualitatively explains these empirical observations. The model we consider is a multiassociative, sparse, Willshaw-like model consisting of binary threshold neurons and binary synapses. It uses recurrent synapses for iterative retrieval of stored memories. We quantify the usefulness of recurrent synapses by simulating the model for small network sizes and by doing a precise mathematical analysis for large network sizes. Given the network parameters, we can determine the precise values of recurrent and afferent synapse densities that optimize the storage capacity of the network. If the network size is like that of a cortical column, then the predicted optimal recurrent density lies in a range that is compatible with biological measurements. Furthermore, we show that our model is able to surpass the standard Willshaw model in the multiassociative case if the information capacity is normalized per strong synapse or per bits required to store the model, as considered in Knoblauch, Palm, and Sommer (2010).
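For readers unfamiliar with the Willshaw model the abstract builds on, here is a minimal illustrative sketch of the classical version: a binary weight matrix stores sparse pattern pairs by Hebbian OR-ing of co-activations, and retrieval thresholds each output unit at the number of active inputs. This is an assumption-laden simplification, not the paper's multiassociative model with recurrent synapses and iterative retrieval; all function names are hypothetical.

```python
# Minimal sketch of a classical Willshaw binary associative memory.
# Patterns are given as sets of active unit indices (sparse coding).

def store(pairs, n_in, n_out):
    """Build the binary weight matrix: w[i][j] = 1 iff input unit i
    and output unit j were co-active in at least one stored pair."""
    w = [[0] * n_out for _ in range(n_in)]
    for x, y in pairs:
        for i in x:
            for j in y:
                w[i][j] = 1
    return w

def retrieve(w, x, n_out):
    """One-step threshold retrieval: an output unit fires iff it
    receives a synapse from every active input unit (threshold = |x|)."""
    sums = [sum(w[i][j] for i in x) for j in range(n_out)]
    return {j for j in range(n_out) if sums[j] == len(x)}

# Example: store two sparse pattern pairs, then recall one of them.
pairs = [({0, 2}, {1, 3}), ({1, 4}, {0, 2})]
W = store(pairs, n_in=5, n_out=5)
print(retrieve(W, {0, 2}, n_out=5))  # → {1, 3}
```

The paper's model replaces this single feedforward pass with iterative retrieval over recurrent synapses, which is what raises the storage capacity.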


Similar Articles

1. Multiassociative Memory: Recurrent Synapses Increase Storage Capacity.
Neural Comput. 2017 May;29(5):1375-1405. doi: 10.1162/NECO_a_00954. Epub 2017 Mar 23.
2. Memory capacity for sequences in a recurrent network with biological constraints.
Neural Comput. 2006 Apr;18(4):904-41. doi: 10.1162/089976606775774714.
3. Efficient Associative Computation with Discrete Synapses.
Neural Comput. 2016 Jan;28(1):118-86. doi: 10.1162/NECO_a_00795. Epub 2015 Nov 24.
4. Memory capacity of networks with stochastic binary synapses.
PLoS Comput Biol. 2014 Aug 7;10(8):e1003727. doi: 10.1371/journal.pcbi.1003727. eCollection 2014 Aug.
5. Memory capacities for synaptic and structural plasticity.
Neural Comput. 2010 Feb;22(2):289-341. doi: 10.1162/neco.2009.08-07-588.
6. Binary Willshaw learning yields high synaptic capacity for long-term familiarity memory.
Biol Cybern. 2012 Feb;106(2):123-33. doi: 10.1007/s00422-012-0488-4. Epub 2012 Apr 6.
7. Dynamics of the CA3 pyramidal neuron autoassociative memory network in the hippocampus.
Philos Trans R Soc Lond B Biol Sci. 1994 Jan 29;343(1304):167-87. doi: 10.1098/rstb.1994.0019.
8. Improved bidirectional retrieval of sparse patterns stored by Hebbian learning.
Neural Netw. 1999 Mar;12(2):281-297. doi: 10.1016/s0893-6080(98)00125-7.
9. Storage capacity of networks with discrete synapses and sparsely encoded memories.
Phys Rev E. 2022 May;105(5-1):054408. doi: 10.1103/PhysRevE.105.054408.
10. Bayesian retrieval in associative memories with storage errors.
IEEE Trans Neural Netw. 1998;9(4):705-13. doi: 10.1109/72.701183.

Cited By

1. Adaptive Tuning Curve Widths Improve Sample Efficient Learning.
Front Comput Neurosci. 2020 Feb 18;14:12. doi: 10.3389/fncom.2020.00012. eCollection 2020.