


A Theory of Sequence Indexing and Working Memory in Recurrent Neural Networks.

Authors

E. Paxon Frady, Denis Kleyko, Friedrich T. Sommer

Affiliations

Redwood Center for Theoretical Neuroscience, University of California, Berkeley, Berkeley, CA 94720, U.S.A.

Department of Computer Science, Electrical and Space Engineering, Luleå University of Technology, Luleå SE-971 87, Sweden

Publication

Neural Comput. 2018 Jun;30(6):1449-1513. doi: 10.1162/neco_a_01084. Epub 2018 Apr 13.

DOI: 10.1162/neco_a_01084
PMID: 29652585
Abstract

To accommodate structured approaches of neural computation, we propose a class of recurrent neural networks for indexing and storing sequences of symbols or analog data vectors. These networks with randomized input weights and orthogonal recurrent weights implement coding principles previously described in vector symbolic architectures (VSA) and leverage properties of reservoir computing. In general, the storage in reservoir computing is lossy, and crosstalk noise limits the retrieval accuracy and information capacity. A novel theory to optimize memory performance in such networks is presented and compared with simulation experiments. The theory describes linear readout of analog data and readout with winner-take-all error correction of symbolic data as proposed in VSA models. We find that diverse VSA models from the literature have universal performance properties, which are superior to what previous analyses predicted. Further, we propose novel VSA models with the statistically optimal Wiener filter in the readout that exhibit much higher information capacity, in particular for storing analog data. The theory we present also applies to memory buffers, networks with gradual forgetting, which can operate on infinite data streams without memory overflow. Interestingly, we find that different forgetting mechanisms, such as attenuating recurrent weights or neural nonlinearities, produce very similar behavior if the forgetting time constants are matched. Such models exhibit extensive capacity when their forgetting time constant is optimized for given noise conditions and network size. These results enable the design of new types of VSA models for the online processing of data streams.
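The indexing scheme the abstract describes can be illustrated with a minimal sketch (not the authors' code): symbols are encoded as random bipolar hypervectors, the orthogonal recurrent matrix is taken to be a fixed random permutation (a common VSA choice), a sequence is superposed into a single trace, and readout uses winner-take-all matching against the codebook. All names and parameter values here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2048                        # network / hypervector dimension (assumed)
symbols = list("ABCDEFGHIJ")    # symbolic alphabet
codebook = {s: rng.choice([-1.0, 1.0], size=N) for s in symbols}

# Orthogonal recurrent weights: a fixed random permutation of the components.
perm = rng.permutation(N)
inv_perm = np.argsort(perm)     # undoes one application of `perm`

def store(sequence):
    """Superpose a sequence into one trace; the item at delay d is permuted d times."""
    trace = np.zeros(N)
    for token in sequence:
        trace = trace[perm] + codebook[token]   # recurrent step, then input
    return trace

def recall(trace, delay):
    """Undo `delay` recurrent steps, then winner-take-all against the codebook."""
    probe = trace
    for _ in range(delay):
        probe = probe[inv_perm]
    return max(codebook, key=lambda s: probe @ codebook[s])

seq = "CAFE"
trace = store(seq)
# Delay 0 recovers the last item, delay len(seq)-1 the first; crosstalk from
# the other superposed items is the lossy-storage noise the theory quantifies.
decoded = "".join(recall(trace, len(seq) - 1 - i) for i in range(len(seq)))
```

Because the correct item's match score scales with N while crosstalk noise scales only with sqrt(N), short sequences decode reliably at this dimension, which is the trade-off the paper's capacity theory makes precise.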

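The memory-buffer variant with gradual forgetting can likewise be sketched by attenuating the recurrent weights with a factor λ < 1 (value chosen for illustration): the buffer runs over an unbounded stream without overflowing, because items fade geometrically with their age while recent items remain decodable.

```python
import numpy as np

rng = np.random.default_rng(1)
N, lam = 2048, 0.9              # dimension and forgetting factor (assumed values)
symbols = list("ABCDEFGHIJ")
codebook = {s: rng.choice([-1.0, 1.0], size=N) for s in symbols}
perm = rng.permutation(N)
inv_perm = np.argsort(perm)

# An effectively infinite input stream, truncated here for the demo.
stream = "".join(rng.choice(symbols, size=200))

trace = np.zeros(N)
for token in stream:
    # Attenuated recurrent weights: old content decays by lam each step,
    # so the trace norm stays bounded on an unbounded stream.
    trace = lam * trace[perm] + codebook[token]

def recall(trace, delay):
    """Rewind `delay` steps, then winner-take-all readout."""
    probe = trace
    for _ in range(delay):
        probe = probe[inv_perm]
    return max(codebook, key=lambda s: probe @ codebook[s])

# The most recent few tokens decode reliably; older ones sink below the noise.
recent = "".join(recall(trace, d) for d in range(4, -1, -1))
```

An item at delay d contributes with amplitude λ^d against a crosstalk floor set by all stored items, so the decodable window depends on λ, the noise level, and N, which is exactly the trade-off behind the paper's optimal forgetting time constant.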

Similar Articles

1. A Theory of Sequence Indexing and Working Memory in Recurrent Neural Networks. Neural Comput. 2018 Jun;30(6):1449-1513. doi: 10.1162/neco_a_01084. Epub 2018 Apr 13.
2. Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons. Neural Comput. 2010 May;22(5):1272-311. doi: 10.1162/neco.2009.01-09-947.
3. Real-time computing without stable states: a new framework for neural computation based on perturbations. Neural Comput. 2002 Nov;14(11):2531-60. doi: 10.1162/089976602760407955.
4. Competitive layer model of discrete-time recurrent neural networks with LT neurons. Neural Comput. 2010 Aug;22(8):2137-60. doi: 10.1162/NECO_a_00004-Zhou.
5. Short-term memory in orthogonal neural networks. Phys Rev Lett. 2004 Apr 9;92(14):148102. doi: 10.1103/PhysRevLett.92.148102.
6. Noise tolerance of attractor and feedforward memory models. Neural Comput. 2012 Feb;24(2):332-90. doi: 10.1162/NECO_a_00234. Epub 2011 Nov 17.
7. Joining distributed pattern processing and homeostatic plasticity in recurrent on-center off-surround shunting networks: noise, saturation, short-term memory, synaptic scaling, and BDNF. Neural Netw. 2012 Jan;25(1):21-9. doi: 10.1016/j.neunet.2011.07.009. Epub 2011 Aug 12.
8. Efficient reinforcement learning of a reservoir network model of parametric working memory achieved with a cluster population winner-take-all readout mechanism. J Neurophysiol. 2015 Dec;114(6):3296-305. doi: 10.1152/jn.00378.2015. Epub 2015 Oct 7.
9. Short-term memory capacity in networks via the restricted isometry property. Neural Comput. 2014 Jun;26(6):1198-235. doi: 10.1162/NECO_a_00590. Epub 2014 Mar 31.
10. Computational analysis of memory capacity in echo state networks. Neural Netw. 2016 Nov;83:109-120. doi: 10.1016/j.neunet.2016.07.012. Epub 2016 Aug 16.

Cited By

1. Maximizing theoretical and practical storage capacity in single-layer feedforward neural networks. Front Comput Neurosci. 2025 Aug 25;19:1646810. doi: 10.3389/fncom.2025.1646810. eCollection 2025.
2. Binding in hippocampal-entorhinal circuits enables compositionality in cognitive maps. Adv Neural Inf Process Syst. 2024;37:39128-39157.
3. Principled neuromorphic reservoir computing. Nat Commun. 2025 Jan 14;16(1):640. doi: 10.1038/s41467-025-55832-y.
4. Computing With Residue Numbers in High-Dimensional Representation. Neural Comput. 2024 Dec 12;37(1):1-37. doi: 10.1162/neco_a_01723.
5. Binding in hippocampal-entorhinal circuits enables compositionality in cognitive maps. ArXiv. 2024 Jun 27:arXiv:2406.18808v1.
6. Computing with Residue Numbers in High-Dimensional Representation. ArXiv. 2023 Nov 8:arXiv:2311.04872v1.
7. Vector Symbolic Architectures as a Computing Framework for Emerging Hardware. Proc IEEE Inst Electr Electron Eng. 2022 Oct;110(10):1538-1571. Epub 2022 Oct 17.
8. Perceptron Theory Can Predict the Accuracy of Neural Networks. IEEE Trans Neural Netw Learn Syst. 2024 Jul;35(7):9885-9899. doi: 10.1109/TNNLS.2023.3237381. Epub 2024 Jul 10.
9. On separating long- and short-term memories in hyperdimensional computing. Front Neurosci. 2023 Jan 9;16:867568. doi: 10.3389/fnins.2022.867568. eCollection 2022.
10. Associative memory of structured knowledge. Sci Rep. 2022 Dec 17;12(1):21808. doi: 10.1038/s41598-022-25708-y.