


Excitable networks for finite state computation with continuous time recurrent neural networks.

Affiliations

Center for Systems, Dynamics and Control, Department of Mathematics, University of Exeter, Exeter, EX4 4QF, UK.

Department of Mathematics, University of Auckland, Auckland, 1142, New Zealand.

Publication

Biol Cybern. 2021 Oct;115(5):519-538. doi: 10.1007/s00422-021-00895-5. Epub 2021 Oct 5.

DOI:10.1007/s00422-021-00895-5
PMID:34608540
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8589808/
Abstract

Continuous time recurrent neural networks (CTRNN) are systems of coupled ordinary differential equations that are simple enough to be insightful for describing learning and computation, from both biological and machine learning viewpoints. We describe a direct constructive method of realising finite state input-dependent computations on an arbitrary directed graph. The constructed system has an excitable network attractor whose dynamics we illustrate with a number of examples. The resulting CTRNN has intermittent dynamics: trajectories spend long periods of time close to steady-state, with rapid transitions between states. Depending on parameters, transitions between states can either be excitable (inputs or noise needs to exceed a threshold to induce the transition), or spontaneous (transitions occur without input or noise). In the excitable case, we show the threshold for excitability can be made arbitrarily sensitive.
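The dynamics described above follow the standard CTRNN equation, τ dy/dt = −y + W σ(y + b) + I. As a minimal illustration (not the paper's graph construction), the sketch below Euler-integrates a two-node network with mutual inhibition; the weights, biases, and initial condition are chosen here purely for demonstration. The trajectory breaks the symmetry of the unstable origin and settles near a steady state, the kind of near-steady-state behaviour the abstract refers to.

```python
import numpy as np

def sigma(x):
    """Logistic activation commonly used in CTRNN formulations."""
    return 1.0 / (1.0 + np.exp(-x))

def simulate_ctrnn(W, b, I, y0, tau=1.0, dt=0.01, steps=5000):
    """Euler-integrate the standard CTRNN ODE:
        tau * dy/dt = -y + W @ sigma(y + b) + I
    Returns the trajectory as an array of shape (steps + 1, n)."""
    y = np.array(y0, dtype=float)
    traj = [y.copy()]
    for _ in range(steps):
        dy = (-y + W @ sigma(y + b) + I) / tau
        y = y + dt * dy
        traj.append(y.copy())
    return np.array(traj)

# Toy 2-node network with self-excitation and mutual inhibition
# (illustrative values, not taken from the paper).
W = np.array([[ 5.0, -5.0],
              [-5.0,  5.0]])
b = np.zeros(2)
I = np.zeros(2)

# A small asymmetry in the initial condition selects the steady
# state with node 0 active and node 1 suppressed.
traj = simulate_ctrnn(W, b, I, y0=[0.6, 0.0])
print(traj[-1])
```

With these parameters the origin is a saddle, so the trajectory spends its time approaching one of two symmetric stable equilibria (roughly y ≈ (4.9, −4.9) here); in the paper's construction, inputs or noise crossing a threshold would instead kick the state along an edge of the network attractor toward the next equilibrium.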


Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/e115dffaea15/422_2021_895_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/3cbe5a3b7b24/422_2021_895_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/6cc9aee76fe1/422_2021_895_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/b4e990172709/422_2021_895_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/fe122d84f60f/422_2021_895_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/ea99fca5adb6/422_2021_895_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/afdd67a74c6b/422_2021_895_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/39966ddf24ec/422_2021_895_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/cbb28611bbff/422_2021_895_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/44f92c09937a/422_2021_895_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/c0a24f9ce837/422_2021_895_Fig11_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/808cb6ff5280/422_2021_895_Fig12_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/8dccea6ee364/422_2021_895_Fig13_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/85de21fafe5a/422_2021_895_Fig14_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/ec304ce190b7/422_2021_895_Fig15_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/661a4fe9233b/422_2021_895_Fig16_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/3183569422a0/422_2021_895_Fig17_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd63/8589808/972de9a157e2/422_2021_895_Fig18_HTML.jpg

Similar articles

1. Excitable networks for finite state computation with continuous time recurrent neural networks.
Biol Cybern. 2021 Oct;115(5):519-538. doi: 10.1007/s00422-021-00895-5. Epub 2021 Oct 5.
2. Sensitive Finite-State Computations Using a Distributed Network With a Noisy Network Attractor.
IEEE Trans Neural Netw Learn Syst. 2018 Dec;29(12):5847-5858. doi: 10.1109/TNNLS.2018.2813404. Epub 2018 Apr 4.
3. A Continuous Time Dynamical Turing Machine.
IEEE Trans Neural Netw Learn Syst. 2025 Apr;36(4):6503-6515. doi: 10.1109/TNNLS.2024.3397995. Epub 2025 Apr 4.
4. Network attractors and nonlinear dynamics of neural computation.
Curr Opin Neurobiol. 2024 Feb;84:102818. doi: 10.1016/j.conb.2023.102818. Epub 2023 Dec 8.
5. Network capacity analysis for latent attractor computation.
Network. 2003 May;14(2):273-302.
6. The dynamics of discrete-time computation, with application to recurrent neural networks and finite state machine extraction.
Neural Comput. 1996 Aug 15;8(6):1135-78. doi: 10.1162/neco.1996.8.6.1135.
7. Geometric framework to predict structure from function in neural networks.
Phys Rev Res. 2022 Jun-Aug;4(2):023255. doi: 10.1103/physrevresearch.4.023255. Epub 2022 Jun 22.
8. Dynamics and computation of continuous attractors.
Neural Comput. 2008 Apr;20(4):994-1025. doi: 10.1162/neco.2008.10-06-378.
9. Vector Symbolic Finite State Machines in Attractor Neural Networks.
Neural Comput. 2024 Mar 21;36(4):549-595. doi: 10.1162/neco_a_01638.
10. Feedforward Approximations to Dynamic Recurrent Network Architectures.
Neural Comput. 2018 Feb;30(2):546-567. doi: 10.1162/neco_a_01042. Epub 2017 Nov 21.

Cited by

1. English Lexical Analysis System of Machine Translation Based on Simple Recurrent Neural Network.
Comput Intell Neurosci. 2022 Jun 16;2022:9702112. doi: 10.1155/2022/9702112. eCollection 2022.

References

1. Before and beyond the Wilson-Cowan equations.
J Neurophysiol. 2020 May 1;123(5):1645-1656. doi: 10.1152/jn.00404.2019. Epub 2020 Mar 18.
2. Sensitive Finite-State Computations Using a Distributed Network With a Noisy Network Attractor.
IEEE Trans Neural Netw Learn Syst. 2018 Dec;29(12):5847-5858. doi: 10.1109/TNNLS.2018.2813404. Epub 2018 Apr 4.
3. Convection in a rotating layer: a simple case of turbulence.
Science. 1980 Apr 11;208(4440):173-5. doi: 10.1126/science.208.4440.173.
4. Generation and reshaping of sequences in neural systems.
Biol Cybern. 2006 Dec;95(6):519-36. doi: 10.1007/s00422-006-0121-5. Epub 2006 Nov 29.
5. On the origin of reproducible sequential activity in neural circuits.
Chaos. 2004 Dec;14(4):1123-9. doi: 10.1063/1.1819625.
6. Dynamical encoding by networks of competing neuron groups: winnerless competition.
Phys Rev Lett. 2001 Aug 6;87(6):068102. doi: 10.1103/PhysRevLett.87.068102. Epub 2001 Jul 20.
7. Excitatory and inhibitory interactions in localized populations of model neurons.
Biophys J. 1972 Jan;12(1):1-24. doi: 10.1016/S0006-3495(72)86068-5.
8. "Neural" computation of decisions in optimization problems.
Biol Cybern. 1985;52(3):141-52. doi: 10.1007/BF00339943.