Institut de Recherche en Sciences Psychologiques (IPSY) and Institute of Neuroscience (IoNS), Louvain Bionics, Crossmodal Perception and Plasticity Lab, Université catholique de Louvain (UCL), 1348 Louvain-la-Neuve, Belgium.
Center for Mind/Brain Sciences (CIMeC), University of Trento, 38068 Rovereto, Italy.
Curr Biol. 2020 Jun 22;30(12):2289-2299.e8. doi: 10.1016/j.cub.2020.04.039. Epub 2020 May 21.
The human occipito-temporal region hMT/V5 is well known for processing visual motion direction. Here, we demonstrate that hMT/V5 also represents the direction of auditory motion, in a format partially aligned with the one used to code visual motion. We show that auditory and visual motion directions can be reliably decoded in individually localized hMT/V5 and that motion directions in one modality can be predicted from the activity patterns elicited by the other modality. Despite this shared motion-direction information, however, vision and audition produce overall opposite voxel-wise responses in hMT/V5. Our results reveal a multifaceted representation of multisensory motion signals in hMT/V5 and have broader implications for how we conceive of the division of sensory labor between brain regions dedicated to specific perceptual functions.