Department of Neurobiology and Anatomy, Medical School, The University of Texas, Houston, TX, USA.
Department of Experimental Radiation Oncology, The University of Texas MD Anderson Cancer Center, Houston, TX, USA.
J Comput Neurosci. 2022 Feb;50(1):121-132. doi: 10.1007/s10827-021-00797-2. Epub 2021 Oct 3.
Recurrent neural networks of spiking neurons can exhibit long-lasting and even persistent activity. Such networks are often not robust, and they exhibit spike and firing-rate statistics that are inconsistent with experimental observations. To overcome this problem, most previous models had to assume that recurrent connections are dominated by slower NMDA-type excitatory receptors. Usually, the single neurons within these networks are very simple leaky integrate-and-fire neurons or other low-dimensional model neurons. However, real neurons are much more complex and exhibit a plethora of active conductances that are recruited in both the sub- and suprathreshold regimes. Here we show that by including a small number of additional active conductances we can produce recurrent networks that are both more robust and exhibit firing-rate statistics that are more consistent with experimental results. We show that this holds both for bistable recurrent networks, which are thought to underlie working memory, and for slowly decaying networks, which might underlie the estimation of interval timing. We also show that by including these conductances, such networks can be trained, using a simple learning rule, to predict temporal intervals that are an order of magnitude longer than those that can be trained in networks of leaky integrate-and-fire neurons.
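To make the contrast between a plain leaky integrate-and-fire unit and one augmented with an additional active conductance concrete, the following is a minimal, illustrative sketch; it is not the authors' model. The extra conductance here (its parameters `g_max`, `E_active`, `tau_a`, `a_inc` and its spike-triggered recruitment rule) is a hypothetical, slowly decaying depolarizing current added purely for demonstration, with placeholder parameter values.

```python
import numpy as np

# Illustrative sketch: a leaky integrate-and-fire (LIF) neuron augmented with
# one hypothetical, slowly decaying active conductance recruited by spiking.
# Parameter values are placeholders, not fitted to the paper.

dt = 0.1          # ms, Euler time step
T = 1000.0        # ms, simulation length
steps = int(T / dt)

# Passive (LIF) parameters
C = 1.0           # membrane capacitance (arbitrary units)
g_L = 0.05        # leak conductance
E_L = -70.0       # leak reversal potential (mV)
V_th = -50.0      # spike threshold (mV)
V_reset = -65.0   # reset potential (mV)

# Hypothetical additional active conductance (slow, depolarizing)
g_max = 0.03      # maximal conductance
E_active = -20.0  # reversal potential (mV)
tau_a = 200.0     # ms, slow decay time constant of activation
a_inc = 0.2       # activation increment per spike

V = E_L
a = 0.0           # activation variable of the extra conductance
I_ext = 1.2       # constant external drive

spike_times = []
for step in range(steps):
    t = step * dt
    I_active = g_max * a * (E_active - V)   # extra inward current when a > 0
    dV = (g_L * (E_L - V) + I_active + I_ext) / C
    V += dt * dV
    a += dt * (-a / tau_a)                  # slow decay between spikes
    if V >= V_th:
        spike_times.append(t)
        V = V_reset
        a = min(a + a_inc, 1.0)             # spike-triggered recruitment

print(f"{len(spike_times)} spikes in {T:.0f} ms")
```

Setting `g_max = 0` recovers the plain LIF neuron; with `g_max > 0`, the slow variable `a` carries activity across time scales much longer than the membrane time constant, which is the kind of single-cell mechanism the abstract argues can make recurrent network activity more robust.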