

Learning spatiotemporal signals using a recurrent spiking network that discretizes time.

Affiliations

Department of Bioengineering, Imperial College London, London, United Kingdom.

Department of Mathematics, Imperial College London, London, United Kingdom.

Publication

PLoS Comput Biol. 2020 Jan 21;16(1):e1007606. doi: 10.1371/journal.pcbi.1007606. eCollection 2020 Jan.

DOI: 10.1371/journal.pcbi.1007606
PMID: 31961853
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7028299/
Abstract

Learning to produce spatiotemporal sequences is a common task that the brain has to solve. The same neurons may be used to produce different sequential behaviours. The way the brain learns and encodes such tasks remains unknown as current computational models do not typically use realistic biologically-plausible learning. Here, we propose a model where a spiking recurrent network of excitatory and inhibitory spiking neurons drives a read-out layer: the dynamics of the driver recurrent network is trained to encode time which is then mapped through the read-out neurons to encode another dimension, such as space or a phase. Different spatiotemporal patterns can be learned and encoded through the synaptic weights to the read-out neurons that follow common Hebbian learning rules. We demonstrate that the model is able to learn spatiotemporal dynamics on time scales that are behaviourally relevant and we show that the learned sequences are robustly replayed during a regime of spontaneous activity.
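The abstract describes a two-stage architecture: a recurrent network whose dynamics tick through discrete time states, and a read-out layer whose Hebbian-trained weights map each time state to a spatial activation. The following is a highly simplified numerical sketch of that separation, not the paper's actual spiking model: the recurrent "clock" is idealized as a one-hot time code, and the read-out is trained with a plain outer-product Hebbian rule while the target pattern is imposed on the read-out layer. All sizes and names are illustrative assumptions.

```python
import numpy as np

T = 50          # discrete time steps encoded by the "clock" network
n_out = 8       # read-out neurons encoding the other dimension (e.g. space)
eta = 0.5       # Hebbian learning rate
reps = 10       # training repetitions of the sequence

# Discretized time: at step t, only clock group t is active (one-hot code).
clock = np.eye(T)

# Target spatiotemporal pattern: read-out neuron j activates at its own
# preferred time, tracing out a spatial sequence.
target = np.zeros((T, n_out))
for j in range(n_out):
    target[(j * T) // n_out, j] = 1.0

# Hebbian rule: weight change proportional to pre (clock) x post (read-out)
# coactivity while the target pattern is imposed on the read-out layer.
W = np.zeros((n_out, T))
for _ in range(reps):
    for t in range(T):
        W += eta * np.outer(target[t], clock[t])

# Replay: the clock alone now drives the read-out sequence.
replay = clock @ W.T            # shape (T, n_out), proportional to target
assert np.allclose(replay, reps * eta * target)
```

Because the time code is one-hot, each column of W stores one slice of the pattern, so the replayed activity is exactly proportional to the target; in the paper this role is played by spiking excitatory/inhibitory dynamics rather than an idealized clock.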


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4627/7028299/c6b6769e9385/pcbi.1007606.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4627/7028299/245105ed0887/pcbi.1007606.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4627/7028299/bb6dfe4bb32a/pcbi.1007606.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4627/7028299/4d2ae6ae44e4/pcbi.1007606.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4627/7028299/bd42b15e6a4a/pcbi.1007606.g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4627/7028299/3f8883853073/pcbi.1007606.g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4627/7028299/866e358efd78/pcbi.1007606.g007.jpg

Similar articles

1
Learning spatiotemporal signals using a recurrent spiking network that discretizes time.
PLoS Comput Biol. 2020 Jan 21;16(1):e1007606. doi: 10.1371/journal.pcbi.1007606. eCollection 2020 Jan.
2
Learning recurrent dynamics in spiking networks.
Elife. 2018 Sep 20;7:e37124. doi: 10.7554/eLife.37124.
3
Learning precise spatiotemporal sequences via biophysically realistic learning rules in a modular, spiking network.
Elife. 2021 Mar 18;10:e63751. doi: 10.7554/eLife.63751.
4
Competitive STDP Learning of Overlapping Spatial Patterns.
Neural Comput. 2015 Aug;27(8):1673-85. doi: 10.1162/NECO_a_00753. Epub 2015 Jun 16.
5
A Spiking Neural Network System for Robust Sequence Recognition.
IEEE Trans Neural Netw Learn Syst. 2016 Mar;27(3):621-35. doi: 10.1109/TNNLS.2015.2416771. Epub 2015 Apr 14.
6
Resonance with subthreshold oscillatory drive organizes activity and optimizes learning in neural networks.
Proc Natl Acad Sci U S A. 2018 Mar 27;115(13):E3017-E3025. doi: 10.1073/pnas.1716933115. Epub 2018 Mar 15.
7
Reinforcement learning using a continuous time actor-critic framework with spiking neurons.
PLoS Comput Biol. 2013 Apr;9(4):e1003024. doi: 10.1371/journal.pcbi.1003024. Epub 2013 Apr 11.
8
Mirrored STDP Implements Autoencoder Learning in a Network of Spiking Neurons.
PLoS Comput Biol. 2015 Dec 3;11(12):e1004566. doi: 10.1371/journal.pcbi.1004566. eCollection 2015 Dec.
9
Training a spiking neuronal network model of visual-motor cortex to play a virtual racket-ball game using reinforcement learning.
PLoS One. 2022 May 11;17(5):e0265808. doi: 10.1371/journal.pone.0265808. eCollection 2022.
10
A spiking neural network model of an actor-critic learning agent.
Neural Comput. 2009 Feb;21(2):301-39. doi: 10.1162/neco.2008.08-07-593.

Cited by

1
Hippocampal sequences represent working memory and implicit timing.
bioRxiv. 2025 Mar 17:2025.03.17.643736. doi: 10.1101/2025.03.17.643736.
2
A neural basis for learning sequential memory in brain loop structures.
Front Comput Neurosci. 2024 Aug 5;18:1421458. doi: 10.3389/fncom.2024.1421458. eCollection 2024.
3
Composing recurrent spiking neural networks using locally-recurrent motifs and risk-mitigating architectural optimization.
Front Neurosci. 2024 Jun 20;18:1412559. doi: 10.3389/fnins.2024.1412559. eCollection 2024.
4
Neural Sequences and the Encoding of Time.
Adv Exp Med Biol. 2024;1455:81-93. doi: 10.1007/978-3-031-60183-5_5.
5
Emergence of brain-inspired small-world spiking neural network through neuroevolution.
iScience. 2024 Jan 9;27(2):108845. doi: 10.1016/j.isci.2024.108845. eCollection 2024 Feb 16.
6
Adaptive structure evolution and biologically plausible synaptic plasticity for recurrent spiking neural networks.
Sci Rep. 2023 Oct 7;13(1):16924. doi: 10.1038/s41598-023-43488-x.
7
Spiking Recurrent Neural Networks Represent Task-Relevant Neural Sequences in Rule-Dependent Computation.
Cognit Comput. 2023 Jul;15(4):1167-1189. doi: 10.1007/s12559-022-09994-2. Epub 2022 Feb 5.
8
Editorial: Reproducibility in neuroscience.
Front Integr Neurosci. 2023 Aug 25;17:1271818. doi: 10.3389/fnint.2023.1271818. eCollection 2023.
9
Long- and short-term history effects in a spiking network model of statistical learning.
Sci Rep. 2023 Aug 9;13(1):12939. doi: 10.1038/s41598-023-39108-3.
10
Memory rescue and learning in synaptic impaired neuronal circuits.
iScience. 2023 May 29;26(7):106931. doi: 10.1016/j.isci.2023.106931. eCollection 2023 Jul 21.

References

1
From space to time: Spatial inhomogeneities lead to the emergence of spatiotemporal sequences in spiking neuronal networks.
PLoS Comput Biol. 2019 Oct 25;15(10):e1007432. doi: 10.1371/journal.pcbi.1007432. eCollection 2019 Oct.
2
Distinct role of flexible and stable encodings in sequential working memory.
Neural Netw. 2020 Jan;121:419-429. doi: 10.1016/j.neunet.2019.09.034. Epub 2019 Sep 28.
3
Reliable Sequential Activation of Neural Assemblies by Single Pyramidal Cells in a Three-Layered Cortex.
Neuron. 2019 Oct 23;104(2):353-369.e5. doi: 10.1016/j.neuron.2019.07.017. Epub 2019 Aug 19.
4
A diversity of interneurons and Hebbian plasticity facilitate rapid compressible learning in the hippocampus.
Nat Neurosci. 2019 Jul;22(7):1168-1181. doi: 10.1038/s41593-019-0415-2. Epub 2019 Jun 24.
5
Fundamental bounds on learning performance in neural circuits.
Proc Natl Acad Sci U S A. 2019 May 21;116(21):10537-10546. doi: 10.1073/pnas.1813416116. Epub 2019 May 6.
6
Neural mechanisms of attending to items in working memory.
Neurosci Biobehav Rev. 2019 Jun;101:1-12. doi: 10.1016/j.neubiorev.2019.03.017. Epub 2019 Mar 26.
7
Somatostatin-Expressing Interneurons Enable and Maintain Learning-Dependent Sequential Activation of Pyramidal Neurons.
Neuron. 2019 Apr 3;102(1):202-216.e7. doi: 10.1016/j.neuron.2019.01.036. Epub 2019 Feb 18.
8
Unsupervised discovery of temporal sequences in high-dimensional datasets, with applications to neuroscience.
Elife. 2019 Feb 5;8:e38471. doi: 10.7554/eLife.38471.
9
Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks.
Neuron. 2018 Aug 8;99(3):609-623.e29. doi: 10.1016/j.neuron.2018.07.003. Epub 2018 Jul 26.
10
Excitable neuronal assemblies with adaptation as a building block of brain circuits for velocity-controlled signal propagation.
PLoS Comput Biol. 2018 Jul 6;14(7):e1006216. doi: 10.1371/journal.pcbi.1006216. eCollection 2018 Jul.