A modular attractor associative memory with patchy connectivity and weight pruning.

Affiliation

Department of Computational Biology (CB), School of Computer Science and Communication (CSC), Royal Institute of Technology (KTH), Stockholm, Sweden.

Publication

Network. 2013;24(4):129-50. doi: 10.3109/0954898X.2013.859323.

DOI: 10.3109/0954898X.2013.859323
PMID: 24251411
Abstract

An important research topic in neuroscience is the study of the mechanisms underlying memory and the estimation of the information capacity of the biological system. In this report we investigate the performance of a modular attractor network with recurrent connections similar to the cortical long-range connections extending in the horizontal direction. We consider a single learning rule, the BCPNN, which implements a form of Hebbian learning, and train the network on sparse random patterns. The storage capacity was measured experimentally for networks of between 500 and 46 K units at a constant activity level, while gradually diluting the connectivity. We show that the storage capacity of the modular network with patchy connectivity is comparable to the theoretical values estimated for simple associative memories, and furthermore we introduce a new technique for pruning the connectivity that enhances the storage capacity up to the asymptotic value.
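The ingredients the abstract names — sparse random patterns, BCPNN-style Hebbian weights, winner-take-all attractor dynamics at a constant activity level, and magnitude-based weight pruning — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a simplified non-modular network, illustrative sizes (`N`, `K`, `P`, `keep_frac` are arbitrary choices), and the standard BCPNN log-odds weight form w_ij = log(p_ij / (p_i p_j)) estimated from pattern statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200   # units (illustrative; the paper scales from 500 to 46 K)
K = 10    # active units per pattern (constant activity level)
P = 20    # number of stored sparse random patterns

# Sparse random binary patterns: exactly K active units out of N.
patterns = np.zeros((P, N))
for mu in range(P):
    patterns[mu, rng.choice(N, K, replace=False)] = 1.0

# BCPNN-style weights from pattern statistics: w_ij = log(p_ij / (p_i p_j)).
eps = 1.0 / P**2                  # regularizer against log(0)
p_i = patterns.mean(axis=0)       # unit activation probabilities
p_ij = patterns.T @ patterns / P  # pairwise coactivation probabilities
W = np.log((p_ij + eps) / (np.outer(p_i, p_i) + eps))
np.fill_diagonal(W, 0.0)

def prune(W, keep_frac):
    """Weight pruning: zero all but the largest-magnitude fraction of weights."""
    thresh = np.quantile(np.abs(W), 1.0 - keep_frac)
    return np.where(np.abs(W) >= thresh, W, 0.0)

def recall(W, cue, steps=5):
    """Attractor dynamics: each step keeps the K best-supported units active."""
    x = cue.copy()
    for _ in range(steps):
        support = W @ x
        x = np.zeros(N)
        x[np.argsort(support)[-K:]] = 1.0
    return x

# Cue the network with a stored pattern whose active units are half deleted.
target = patterns[0]
cue = target.copy()
cue[np.flatnonzero(target)[: K // 2]] = 0.0

Wp = prune(W, keep_frac=0.15)            # strongly diluted connectivity
overlap = recall(Wp, cue) @ target / K   # fraction of active units recovered
```

Magnitude pruning discards mostly near-zero weights between units that rarely co-occur, which is why, as in the paper's experiments, capacity degrades gracefully under dilution; at these toy sizes the pruned network still completes the degraded cue.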

Similar Articles

1. A modular attractor associative memory with patchy connectivity and weight pruning.
   Network. 2013;24(4):129-50. doi: 10.3109/0954898X.2013.859323.
2. Network capacity analysis for latent attractor computation.
   Network. 2003 May;14(2):273-302.
3. A model of cortical associative memory based on a horizontal network of connected columns.
   Network. 1998 May;9(2):235-64.
4. Storing structured sparse memories in a multi-modular cortical network model.
   J Comput Neurosci. 2016 Apr;40(2):157-75. doi: 10.1007/s10827-016-0590-z. Epub 2016 Feb 6.
5. Tree-like hierarchical associative memory structures.
   Neural Netw. 2011 Mar;24(2):143-7. doi: 10.1016/j.neunet.2010.09.012. Epub 2010 Oct 7.
6. Memory capacity of balanced networks.
   Neural Comput. 2005 Mar;17(3):691-713. doi: 10.1162/0899766053019962.
7. Pseudo-relaxation learning algorithm for complex-valued associative memory.
   Int J Neural Syst. 2008 Apr;18(2):147-56. doi: 10.1142/S0129065708001452.
8. Attractor dynamics in a modular network model of neocortex.
   Network. 2006 Sep;17(3):253-76. doi: 10.1080/09548980600774619.
9. Neural associative memories and sparse coding.
   Neural Netw. 2013 Jan;37:165-71. doi: 10.1016/j.neunet.2012.08.013. Epub 2012 Sep 14.
10. Memory dynamics in attractor networks with saliency weights.
    Neural Comput. 2010 Jul;22(7):1899-926. doi: 10.1162/neco.2010.07-09-1050.

Cited By

1. Mapping the BCPNN Learning Rule to a Memristor Model.
   Front Neurosci. 2021 Dec 9;15:750458. doi: 10.3389/fnins.2021.750458. eCollection 2021.
2. Optimizing BCPNN Learning Rule for Memory Access.
   Front Neurosci. 2020 Aug 31;14:878. doi: 10.3389/fnins.2020.00878. eCollection 2020.
3. Probabilistic associative learning suffices for learning the temporal structure of multiple sequences.
   PLoS One. 2019 Aug 1;14(8):e0220161. doi: 10.1371/journal.pone.0220161. eCollection 2019.
4. Functional Relevance of Different Basal Ganglia Pathways Investigated in a Spiking Model with Reward Dependent Plasticity.
   Front Neural Circuits. 2016 Jul 21;10:53. doi: 10.3389/fncir.2016.00053. eCollection 2016.
5. Storing structured sparse memories in a multi-modular cortical network model.
   J Comput Neurosci. 2016 Apr;40(2):157-75. doi: 10.1007/s10827-016-0590-z. Epub 2016 Feb 6.
6. Long-range recruitment of Martinotti cells causes surround suppression and promotes saliency in an attractor network model.
   Front Neural Circuits. 2015 Oct 14;9:60. doi: 10.3389/fncir.2015.00060. eCollection 2015.
7. Reducing the computational footprint for real-time BCPNN learning.
   Front Neurosci. 2015 Jan 22;9:2. doi: 10.3389/fnins.2015.00002. eCollection 2015.
8. A spiking neural network model of self-organized pattern recognition in the early mammalian olfactory system.
   Front Neural Circuits. 2014 Feb 7;8:5. doi: 10.3389/fncir.2014.00005. eCollection 2014.