Computational Neurobiology Laboratory, Salk Institute for Biological Studies, La Jolla, CA 92037, U.S.A.
Computational Neurobiology Laboratory, Salk Institute for Biological Studies, La Jolla, CA 92037, and Neurosciences Graduate Program and Medical Scientist Training Program, University of California San Diego, La Jolla, CA 92093, U.S.A.
Neural Comput. 2021 Nov 12;33(12):3264-3287. doi: 10.1162/neco_a_01409.
Recurrent neural network (RNN) models trained to perform cognitive tasks are a useful computational tool for understanding how cortical circuits execute complex computations. However, these models are often composed of units that interact with one another using continuous signals and overlook parameters intrinsic to spiking neurons. Here, we developed a method to directly train not only synaptic-related variables but also membrane-related parameters of a spiking RNN model. Training our model on a wide range of cognitive tasks resulted in diverse yet task-specific synaptic and membrane parameters. We also show that fast membrane time constants and slow synaptic decay dynamics naturally emerge from our model when it is trained on tasks associated with working memory (WM). Further dissecting the optimized parameters revealed that fast membrane properties are important for encoding stimuli, and slow synaptic dynamics are needed for WM maintenance. This approach offers a unique window into how connectivity patterns and intrinsic neuronal properties contribute to complex dynamics in neural populations.
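The abstract's central dissociation — fast membrane dynamics for encoding stimuli, slow synaptic decay for maintaining working memory — can be illustrated with a toy simulation. The sketch below is not the paper's training method; it is a minimal, hypothetical leaky integrate-and-fire unit with a hand-picked step stimulus and illustrative time constants, showing that a fast membrane (`tau_m`) spikes quickly in response to input while a slow synaptic trace (`tau_s`) retains a signal long after the stimulus ends.

```python
import numpy as np

def lif_with_synapse(tau_m, tau_s, dt=1.0, T=200, stim_end=100):
    """Simulate one leaky integrate-and-fire unit driven by a step stimulus.

    tau_m  -- membrane time constant (ms): governs how fast the voltage
              tracks the input (stimulus encoding).
    tau_s  -- synaptic decay time constant (ms): governs how long the
              unit's output trace persists (memory maintenance).
    Returns the voltage trace and the synaptic output trace.
    """
    v, s = 0.0, 0.0
    v_thresh, v_reset = 1.0, 0.0
    v_trace, s_trace = [], []
    for t in range(T):
        I = 1.5 if t < stim_end else 0.0     # step input, off at stim_end
        v += dt / tau_m * (-v + I)           # leaky membrane integration
        if v >= v_thresh:                    # spike: bump synapse, reset
            s += 1.0
            v = v_reset
        s -= dt / tau_s * s                  # exponential synaptic decay
        v_trace.append(v)
        s_trace.append(s)
    return np.array(v_trace), np.array(s_trace)

# Regime the paper reports emerging for WM tasks: fast membrane, slow synapse.
v_fast, s_slow = lif_with_synapse(tau_m=5.0, tau_s=100.0)
# Opposite regime for contrast: sluggish membrane, fast-decaying synapse.
v_slow, s_fast = lif_with_synapse(tau_m=50.0, tau_s=5.0)
```

With these (illustrative) parameters, the fast-membrane unit fires repeatedly during the stimulus, and its slow synaptic trace remains well above zero long after stimulus offset, whereas the slow-membrane/fast-synapse unit barely fires and its trace vanishes almost immediately — a cartoon of why both parameter families matter and why optimizing them jointly is informative.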