
Learning the Synaptic and Intrinsic Membrane Dynamics Underlying Working Memory in Spiking Neural Network Models.

Affiliations

Computational Neurobiology Laboratory, Salk Institute for Biological Studies, La Jolla, CA 92037, U.S.A.

Computational Neurobiology Laboratory, Salk Institute for Biological Studies, La Jolla, CA 92037, and Neurosciences Graduate Program and Medical Scientist Training Program, University of California San Diego, La Jolla, CA 92093, U.S.A.

Publication

Neural Comput. 2021 Nov 12;33(12):3264-3287. doi: 10.1162/neco_a_01409.

DOI: 10.1162/neco_a_01409
PMID: 34710902
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8662709/
Abstract

Recurrent neural network (RNN) models trained to perform cognitive tasks are a useful computational tool for understanding how cortical circuits execute complex computations. However, these models are often composed of units that interact with one another using continuous signals and overlook parameters intrinsic to spiking neurons. Here, we developed a method to directly train not only synaptic-related variables but also membrane-related parameters of a spiking RNN model. Training our model on a wide range of cognitive tasks resulted in diverse yet task-specific synaptic and membrane parameters. We also show that fast membrane time constants and slow synaptic decay dynamics naturally emerge from our model when it is trained on tasks associated with working memory (WM). Further dissecting the optimized parameters revealed that fast membrane properties are important for encoding stimuli, and slow synaptic dynamics are needed for WM maintenance. This approach offers a unique window into how connectivity patterns and intrinsic neuronal properties contribute to complex dynamics in neural populations.
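The core mechanism the abstract describes — leaky membrane dynamics with per-neuron time constants driving slower synaptic traces — can be sketched as a minimal forward simulation. This is a hedged illustration, not the paper's implementation: the function name, parameter shapes, and the simple reset-to-zero rule are assumptions, and the paper additionally trains `tau_m`, `tau_s`, and `W` with gradient-based optimization through a differentiable surrogate of the spike threshold, which is omitted here.

```python
import numpy as np

def simulate_lif_rnn(x, W, tau_m, tau_s, v_th=1.0, dt=1.0):
    """Forward pass of a leaky integrate-and-fire (LIF) recurrent network.

    x:     (T, N) external input current over T time steps
    W:     (N, N) recurrent weight matrix
    tau_m: (N,) per-neuron membrane time constants (fast -> better encoding)
    tau_s: (N,) per-neuron synaptic decay constants (slow -> better maintenance)
    Returns a (T, N) binary spike array.
    """
    T, N = x.shape
    v = np.zeros(N)            # membrane potentials
    s = np.zeros(N)            # filtered synaptic traces of past spikes
    z = np.zeros(N)            # spikes emitted on the previous step
    out = np.zeros((T, N))
    a_m = np.exp(-dt / tau_m)  # per-step membrane decay factor
    a_s = np.exp(-dt / tau_s)  # per-step synaptic decay factor
    for t in range(T):
        s = a_s * s + z                            # slow synaptic filtering
        v = a_m * v + (1 - a_m) * (W @ s + x[t])   # leaky integration
        z = (v >= v_th).astype(float)              # threshold crossing -> spike
        v = v * (1 - z)                            # reset neurons that spiked
        out[t] = z
    return out
```

With this parameterization, the paper's finding maps directly onto the two decay factors: small `tau_m` makes `v` track the input rapidly (stimulus encoding), while large `tau_s` lets `s` retain spike history across a delay period (working-memory maintenance).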


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f70f/8662709/6da6ce6e4e7f/neco_a_01409.figure.01.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f70f/8662709/e107f4745ea1/neco_a_01409.figure.02.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f70f/8662709/bbe36f38ee9b/neco_a_01409.figure.03.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f70f/8662709/9d1377d7370a/neco_a_01409.figure.04.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f70f/8662709/1bdc554a77d3/neco_a_01409.figure.05.jpg

Similar articles

1
Learning the Synaptic and Intrinsic Membrane Dynamics Underlying Working Memory in Spiking Neural Network Models.
Neural Comput. 2021 Nov 12;33(12):3264-3287. doi: 10.1162/neco_a_01409.
2
A Spiking Working Memory Model Based on Hebbian Short-Term Potentiation.
J Neurosci. 2017 Jan 4;37(1):83-96. doi: 10.1523/JNEUROSCI.1989-16.2016.
3
Strong inhibitory signaling underlies stable temporal dynamics and working memory in spiking neural networks.
Nat Neurosci. 2021 Jan;24(1):129-139. doi: 10.1038/s41593-020-00753-w. Epub 2020 Dec 7.
4
Learning recurrent dynamics in spiking networks.
Elife. 2018 Sep 20;7:e37124. doi: 10.7554/eLife.37124.
5
Online Learning and Memory of Neural Trajectory Replays for Prefrontal Persistent and Dynamic Representations in the Irregular Asynchronous State.
Front Neural Circuits. 2021 Jul 8;15:648538. doi: 10.3389/fncir.2021.648538. eCollection 2021.
6
Slow manifolds within network dynamics encode working memory efficiently and robustly.
PLoS Comput Biol. 2021 Sep 15;17(9):e1009366. doi: 10.1371/journal.pcbi.1009366. eCollection 2021 Sep.
7
Synaptic dynamics: linear model and adaptation algorithm.
Neural Netw. 2014 Aug;56:49-68. doi: 10.1016/j.neunet.2014.04.001. Epub 2014 Apr 28.
8
Exact neural mass model for synaptic-based working memory.
PLoS Comput Biol. 2020 Dec 15;16(12):e1008533. doi: 10.1371/journal.pcbi.1008533. eCollection 2020 Dec.
9
Training Spiking Neural Networks in the Strong Coupling Regime.
Neural Comput. 2021 Apr 13;33(5):1199-1233. doi: 10.1162/neco_a_01379.
10
Recurrent neural networks of integrate-and-fire cells simulating short-term memory and wrist movement tasks derived from continuous dynamic networks.
J Physiol Paris. 2003 Jul-Nov;97(4-6):601-12. doi: 10.1016/j.jphysparis.2004.01.017.

Cited by

1
Slow ramping emerges from spontaneous fluctuations in spiking neural networks.
Nat Commun. 2024 Aug 24;15(1):7285. doi: 10.1038/s41467-024-51401-x.
2
Spiking Recurrent Neural Networks Represent Task-Relevant Neural Sequences in Rule-Dependent Computation.
Cognit Comput. 2023 Jul;15(4):1167-1189. doi: 10.1007/s12559-022-09994-2. Epub 2022 Feb 5.

References

1
Simple framework for constructing functional spiking recurrent neural networks.
Proc Natl Acad Sci U S A. 2019 Nov 5;116(45):22811-22820. doi: 10.1073/pnas.1905926116. Epub 2019 Oct 21.
2
Going Deeper in Spiking Neural Networks: VGG and Residual Architectures.
Front Neurosci. 2019 Mar 7;13:95. doi: 10.3389/fnins.2019.00095. eCollection 2019.
3
Deep learning in spiking neural networks.
Neural Netw. 2019 Mar;111:47-63. doi: 10.1016/j.neunet.2018.12.002. Epub 2018 Dec 18.
4
Learning recurrent dynamics in spiking networks.
Elife. 2018 Sep 20;7:e37124. doi: 10.7554/eLife.37124.
5
Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks.
Neuron. 2018 Aug 8;99(3):609-623.e29. doi: 10.1016/j.neuron.2018.07.003. Epub 2018 Jul 26.
6
State-dependent cell-type-specific membrane potential dynamics and unitary synaptic inputs in awake mice.
Elife. 2018 Jul 27;7:e35869. doi: 10.7554/eLife.35869.
7
Flexible Sensorimotor Computations through Rapid Reconfiguration of Cortical Dynamics.
Neuron. 2018 Jun 6;98(5):1005-1019.e5. doi: 10.1016/j.neuron.2018.05.020.
8
Supervised learning in spiking neural networks with FORCE training.
Nat Commun. 2017 Dec 20;8(1):2208. doi: 10.1038/s41467-017-01827-3.
9
Strength and Diversity of Inhibitory Signaling Differentiates Primate Anterior Cingulate from Lateral Prefrontal Cortex.
J Neurosci. 2017 May 3;37(18):4717-4734. doi: 10.1523/JNEUROSCI.3757-16.2017. Epub 2017 Apr 5.
10
Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks.
Elife. 2017 Feb 23;6:e20899. doi: 10.7554/eLife.20899.