
Learning the Synaptic and Intrinsic Membrane Dynamics Underlying Working Memory in Spiking Neural Network Models.

Affiliations

Computational Neurobiology Laboratory, Salk Institute for Biological Studies, La Jolla, CA 92037, U.S.A.

Computational Neurobiology Laboratory, Salk Institute for Biological Studies, La Jolla, CA 92037, and Neurosciences Graduate Program and Medical Scientist Training Program, University of California San Diego, La Jolla, CA 92093, U.S.A.

Publication Information

Neural Comput. 2021 Nov 12;33(12):3264-3287. doi: 10.1162/neco_a_01409.

Abstract

Recurrent neural network (RNN) models trained to perform cognitive tasks are a useful computational tool for understanding how cortical circuits execute complex computations. However, these models are often composed of units that interact with one another using continuous signals and overlook parameters intrinsic to spiking neurons. Here, we developed a method to directly train not only synaptic-related variables but also membrane-related parameters of a spiking RNN model. Training our model on a wide range of cognitive tasks resulted in diverse yet task-specific synaptic and membrane parameters. We also show that fast membrane time constants and slow synaptic decay dynamics naturally emerge from our model when it is trained on tasks associated with working memory (WM). Further dissecting the optimized parameters revealed that fast membrane properties are important for encoding stimuli, and slow synaptic dynamics are needed for WM maintenance. This approach offers a unique window into how connectivity patterns and intrinsic neuronal properties contribute to complex dynamics in neural populations.
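To make the model class concrete, the sketch below simulates a generic leaky integrate-and-fire (LIF) recurrent network in which both the per-neuron membrane time constants (`tau_m`) and synaptic decay constants (`tau_s`) are explicit, per-unit parameters — the quantities the paper reports optimizing. This is a minimal forward-dynamics illustration, not the authors' training procedure (which would additionally require gradient-based optimization of these parameters); all numerical values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 50, 200           # neurons, simulation timesteps
dt = 1.0                 # timestep (ms)
v_th, v_reset = 1.0, 0.0 # spike threshold and reset potential

# Parameters that would be trainable in the paper's setting
# (values here are arbitrary placeholders):
tau_m = rng.uniform(10.0, 50.0, N)   # per-neuron membrane time constants (ms)
tau_s = rng.uniform(20.0, 100.0, N)  # per-neuron synaptic decay constants (ms)
W = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))  # recurrent weights

v = np.zeros(N)          # membrane potentials
s = np.zeros(N)          # synaptic traces (filtered spike trains)
ext = 1.5                # constant external drive (suprathreshold)

spike_counts = np.zeros(N)
for t in range(T):
    I = W @ s + ext                       # recurrent + external input current
    v += dt / tau_m * (-v + I)            # leaky integration of the membrane
    spikes = (v >= v_th).astype(float)    # threshold crossings emit spikes
    v = np.where(spikes > 0, v_reset, v)  # reset spiking neurons
    s += dt * (-s / tau_s) + spikes       # trace decays slowly, jumps on spikes
    spike_counts += spikes

print(spike_counts.sum())  # total spikes emitted by the network
```

Note the two separable timescales: `tau_m` governs how quickly the membrane tracks its input (fast values favor rapid stimulus encoding), while `tau_s` governs how long a spike's influence persists in `s` (slow values can bridge delay periods) — the dissociation the abstract reports emerging from training on working-memory tasks.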


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f70f/8662709/6da6ce6e4e7f/neco_a_01409.figure.01.jpg
