Mukherjee Trishna, Liu Bing, Simoncini Claudio, Osborne Leslie C
Department of Neurobiology
J Neurosci. 2017 Feb 8;37(6):1394-1412. doi: 10.1523/JNEUROSCI.2682-16.2016. Epub 2016 Dec 21.
Despite the enduring interest in motion integration, a direct measure of the space-time filter that the brain imposes on a visual scene has been elusive. This is perhaps because of the challenge of estimating a 3D function from perceptual reports in psychophysical tasks. We take a different approach. We exploit the close connection between visual motion estimates and smooth pursuit eye movements to measure stimulus-response correlations across space and time, computing the linear space-time filter for global motion direction in humans and monkeys. Although derived from eye movements, we find that the filter predicts perceptual motion estimates quite well. To distinguish visual from motor contributions to the temporal duration of the pursuit motion filter, we recorded single-unit responses in the monkey middle temporal cortical area (MT). We find that pursuit response delays are consistent with the distribution of cortical neuron latencies and that temporal motion integration for pursuit is consistent with a short-integration MT subpopulation. Remarkably, the visual system appears to preferentially weight motion signals across a narrow range of foveal eccentricities rather than uniformly over the whole visual field, with a transiently enhanced contribution from locations along the direction of motion. We find that the visual system is most sensitive to motion falling at approximately one-third the radius of the stimulus aperture. Hypothesizing that the visual drive for pursuit is related to the filtered motion energy in a motion stimulus, we compare measured and predicted eye acceleration across several other target forms.

SIGNIFICANCE STATEMENT: A compact model of the spatial and temporal processing underlying global motion perception has been elusive. We used visually driven smooth eye movements to find the 3D space-time function that best predicts both eye movements and perception of translating dot patterns.
We found that the visual system does not appear to use all available motion signals uniformly, but rather weights motion preferentially in a narrow band at approximately one-third the radius of the stimulus. Although not universal, the filter predicts responses to other types of stimuli, demonstrating a remarkable degree of generalization that may lead to a deeper understanding of visual motion processing.
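The core technique the abstract describes — recovering a linear filter from stimulus-response correlations between a motion stimulus and pursuit eye velocity — can be illustrated with a simple one-dimensional sketch. The snippet below is not the authors' analysis; it is a minimal, hypothetical demonstration (in NumPy, with a made-up biphasic filter and white-noise stimulus) of why cross-correlating a white-noise input with the response recovers the linear filter, the same logic extended to space and time in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical white-noise motion-direction stimulus, one sample per time bin
n, taps = 20000, 60
stim = rng.standard_normal(n)

# Made-up biphasic temporal filter standing in for the true space-time filter
t = np.arange(taps)
true_filt = (t / 10.0) * np.exp(-t / 10.0) - 0.4 * (t / 20.0) * np.exp(-t / 20.0)

# Simulated "eye velocity": stimulus passed through the filter, plus noise
resp = np.convolve(stim, true_filt)[:n] + 0.1 * rng.standard_normal(n)

# Reverse correlation: for a white-noise stimulus, the stimulus-response
# cross-correlation at positive lags recovers the linear filter (up to scale)
est = np.array([np.dot(resp[k:], stim[: n - k]) for k in range(taps)]) / n

# Compare the estimate with the ground-truth filter
corr = np.corrcoef(est, true_filt)[0, 1]
print(f"correlation between true and estimated filter: {corr:.3f}")
```

For a non-white stimulus the raw cross-correlation is biased by the stimulus autocorrelation and must be deconvolved (e.g. by regularized regression); the white-noise case above is the simplest setting in which correlation alone suffices.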