Lin I-Chun, Xing Dajun, Shapley Robert
Center for Neural Science, New York University, New York, NY 10003, USA.
J Comput Neurosci. 2012 Dec;33(3):559-72. doi: 10.1007/s10827-012-0401-0. Epub 2012 Jun 10.
One of the reasons the visual cortex has attracted the interest of computational neuroscience is that it has well-defined inputs. The lateral geniculate nucleus (LGN) of the thalamus is the source of visual signals to the primary visual cortex (V1). Most large-scale cortical network models approximate the spike trains of LGN neurons as simple Poisson point processes. However, many studies have shown that neurons in the early visual pathway are capable of spiking with high temporal precision and their discharges are not Poisson-like. To gain an understanding of how response variability in the LGN influences the behavior of V1, we study response properties of model V1 neurons that receive purely feedforward inputs from LGN cells modeled either as noisy leaky integrate-and-fire (NLIF) neurons or as inhomogeneous Poisson processes. We first demonstrate that the NLIF model is capable of reproducing many experimentally observed statistical properties of LGN neurons. Then we show that a V1 model in which the LGN input to a V1 neuron is modeled as a group of NLIF neurons produces higher orientation selectivity than the one with Poisson LGN input. The second result implies that statistical characteristics of LGN spike trains are important for V1's function. We conclude that physiologically motivated models of V1 need to include more realistic LGN spike trains that are less noisy than inhomogeneous Poisson processes.
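The abstract's central contrast, that NLIF spike trains are less noisy than Poisson ones, can be illustrated with a spike-count variability (Fano factor) comparison. The sketch below is a toy model, not the paper's implementation: all parameter values (membrane time constant, threshold, drive, noise level) are arbitrary assumptions, and for simplicity the Poisson process here is homogeneous rather than inhomogeneous as in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def nlif_spike_counts(n_trials=200, T=1.0, dt=1e-4, tau=0.02,
                      v_th=1.0, v_reset=0.0, drive=60.0, sigma=0.5):
    """Spike counts per trial from a noisy leaky integrate-and-fire
    neuron, integrated with the Euler-Maruyama method.
    All parameter values are illustrative assumptions."""
    n_steps = int(T / dt)
    counts = np.empty(n_trials)
    for trial in range(n_trials):
        noise = sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
        v, c = 0.0, 0
        for xi in noise:
            v += dt * (-v / tau + drive) + xi  # leak + drive + white noise
            if v >= v_th:                      # threshold crossing -> spike
                v = v_reset
                c += 1
        counts[trial] = c
    return counts

def poisson_spike_counts(rate, n_trials=200, T=1.0):
    """Rate-matched homogeneous Poisson spike counts."""
    return rng.poisson(rate * T, size=n_trials)

nlif = nlif_spike_counts()
pois = poisson_spike_counts(rate=nlif.mean())  # match the mean firing rate

def fano(c):
    """Fano factor: variance-to-mean ratio of the spike count."""
    return c.var() / c.mean()

print(f"NLIF    Fano factor: {fano(nlif):.2f}")  # well below 1: sub-Poisson
print(f"Poisson Fano factor: {fano(pois):.2f}")  # near 1 by construction
```

The suprathreshold NLIF neuron fires nearly periodically, so its spike-count Fano factor sits far below 1, while a rate-matched Poisson process stays near 1. This variability gap is the statistical property that, per the abstract, drives the difference in orientation selectivity in the downstream V1 model.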