


Optimal short-term memory before the edge of chaos in driven random recurrent networks.

Affiliations

Department of Information and Sciences, School of Arts and Sciences, Tokyo Woman's Christian University, 2-6-1 Zempukuji, Suginami-ku, Tokyo 167-8585, Japan.

Graduate School of Information Science and Technology, University of Tokyo, Bunkyo-ku, Tokyo 113-8656, Japan.

Publication Information

Phys Rev E. 2019 Dec;100(6-1):062312. doi: 10.1103/PhysRevE.100.062312.

DOI: 10.1103/PhysRevE.100.062312
PMID: 31962477
Abstract

The ability of discrete-time nonlinear recurrent neural networks to store time-varying small input signals is investigated with mean-field theory. The combination of a small input strength and mean-field assumptions makes it possible to derive an approximate expression for the conditional probability density of the state of a neuron given a past input signal. From this conditional probability density, we can analytically calculate short-term memory measures, such as memory capacity, mutual information, and Fisher information, and determine the relationships among these measures, which have not been clarified to date to the best of our knowledge. We show that the network contribution of these short-term memory measures peaks before the edge of chaos, where the dynamics of input-driven networks is stable but corresponding systems without input signals are unstable.
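The memory-capacity measure analyzed in the abstract can be illustrated numerically. The sketch below is a minimal simulation, not the paper's mean-field derivation: it assumes a standard echo-state-style network x(t+1) = tanh(g·W·x(t) + v·u(t)) with i.i.d. input, and estimates memory capacity as the sum over delays of the squared correlation between a linear readout and the delayed input. The gain g, network size, input scale, and delay range are all illustrative choices.

```python
import numpy as np

def memory_capacity(g, N=100, T=2000, max_delay=20, input_scale=0.1, seed=0):
    """Estimate short-term memory capacity of a random recurrent network
    x(t+1) = tanh(g*W x(t) + v*u(t)) driven by i.i.d. input u.
    MC = sum over delays k of the squared correlation between the best
    linear readout of the state and the input delayed by k steps."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # edge of chaos near g ~ 1
    v = rng.normal(0.0, 1.0, N)                    # input weights
    u = rng.uniform(-1.0, 1.0, T)                  # small random input signal
    X = np.zeros((T, N))
    x = np.zeros(N)
    for t in range(T - 1):
        x = np.tanh(g * (W @ x) + input_scale * v * u[t])
        X[t + 1] = x
    burn = 200  # discard the initial transient
    mc = 0.0
    for k in range(1, max_delay + 1):
        y = u[burn - k:T - k]      # target: input delayed by k
        Z = X[burn:T]
        # ridge-regularized linear readout
        w = np.linalg.solve(Z.T @ Z + 1e-6 * np.eye(N), Z.T @ y)
        r = np.corrcoef(Z @ w, y)[0, 1]
        if np.isfinite(r):
            mc += r ** 2
    return mc
```

Sweeping g across the transition reproduces the qualitative claim: with a small input, capacity is high in the stable regime below g = 1 and collapses once the input-free dynamics turns chaotic (e.g. compare `memory_capacity(0.9)` with `memory_capacity(1.8)`).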


Similar Articles

1
Optimal short-term memory before the edge of chaos in driven random recurrent networks.
Phys Rev E. 2019 Dec;100(6-1):062312. doi: 10.1103/PhysRevE.100.062312.
2
Real-time computation at the edge of chaos in recurrent neural networks.
Neural Comput. 2004 Jul;16(7):1413-36. doi: 10.1162/089976604323057443.
3
Determination of the Edge of Criticality in Echo State Networks Through Fisher Information Maximization.
IEEE Trans Neural Netw Learn Syst. 2018 Mar;29(3):706-717. doi: 10.1109/TNNLS.2016.2644268. Epub 2017 Jan 16.
4
Information processing in echo state networks at the edge of chaos.
Theory Biosci. 2012 Sep;131(3):205-13. doi: 10.1007/s12064-011-0146-8. Epub 2011 Dec 7.
5
Dynamics and Information Import in Recurrent Neural Networks.
Front Comput Neurosci. 2022 Apr 27;16:876315. doi: 10.3389/fncom.2022.876315. eCollection 2022.
6
Echo state property linked to an input: exploring a fundamental characteristic of recurrent neural networks.
Neural Comput. 2013 Mar;25(3):671-96. doi: 10.1162/NECO_a_00411. Epub 2012 Dec 28.
7
Short-term memory in orthogonal neural networks.
Phys Rev Lett. 2004 Apr 9;92(14):148102. doi: 10.1103/PhysRevLett.92.148102.
8
Effect of recurrent infomax on the information processing capability of input-driven recurrent neural networks.
Neurosci Res. 2020 Jul;156:225-233. doi: 10.1016/j.neures.2020.02.001. Epub 2020 Feb 14.
9
Network dynamics for optimal compressive-sensing input-signal recovery.
Phys Rev E Stat Nonlin Soft Matter Phys. 2014 Oct;90(4):042908. doi: 10.1103/PhysRevE.90.042908. Epub 2014 Oct 9.
10
Computational analysis of memory capacity in echo state networks.
Neural Netw. 2016 Nov;83:109-120. doi: 10.1016/j.neunet.2016.07.012. Epub 2016 Aug 16.

Cited By

1
The 7 Muses of Neuro-Creative Cycle: How some patients with Parkinson's disease can unleash latent creativity.
AIMS Neurosci. 2025 Jun 23;12(2):250-283. doi: 10.3934/Neuroscience.2025014. eCollection 2025.
2
Selective consistency of recurrent neural networks induced by plasticity as a mechanism of unsupervised perceptual learning.
PLoS Comput Biol. 2024 Sep 3;20(9):e1012378. doi: 10.1371/journal.pcbi.1012378. eCollection 2024 Sep.
3
Chaotic neural dynamics facilitate probabilistic computations through sampling.
Proc Natl Acad Sci U S A. 2024 Apr 30;121(18):e2312992121. doi: 10.1073/pnas.2312992121. Epub 2024 Apr 22.
4
Entanglement-Structured LSTM Boosts Chaotic Time Series Forecasting.
Entropy (Basel). 2021 Nov 11;23(11):1491. doi: 10.3390/e23111491.