Jakub Limanowski, Felix Blankenburg
Neurocomputation and Neuroimaging Unit, Department of Education and Psychology, Freie Universität Berlin, 14195 Berlin, Germany
J Neurosci. 2016 Mar 2;36(9):2582-9. doi: 10.1523/JNEUROSCI.3987-15.2016.
The brain constructs a flexible representation of the body from multisensory information. Previous work on monkeys suggests that the posterior parietal cortex (PPC) and ventral premotor cortex (PMv) represent the position of the upper limbs based on visual and proprioceptive information. Human experiments on the rubber hand illusion implicate similar regions, but since such experiments rely on additional visuo-tactile interactions, they cannot isolate visuo-proprioceptive integration. Here, we independently manipulated the position (palm or back facing) of passive human participants' unseen arm and of a photorealistic virtual 3D arm. Functional magnetic resonance imaging (fMRI) revealed that matching visual and proprioceptive information about arm position engaged the PPC, PMv, and the body-selective extrastriate body area (EBA); activity in the PMv moreover reflected interindividual differences in congruent arm ownership. Further, the PPC, PMv, and EBA increased their coupling with the primary visual cortex during congruent visuo-proprioceptive position information. These results suggest that human PPC, PMv, and EBA evaluate visual and proprioceptive position information and, under sufficient cross-modal congruence, integrate it into a multisensory representation of the upper limb in space.
The position of our limbs in space constantly changes, yet the brain manages to represent limb position accurately by combining information from vision and proprioception. Electrophysiological recordings in monkeys have revealed neurons in the posterior parietal and premotor cortices that seem to implement and update such a multisensory limb representation, but this has been difficult to demonstrate in humans. Our fMRI experiment shows that human posterior parietal, premotor, and body-selective visual brain areas respond preferentially to a virtual arm seen in a position corresponding to one's hidden real arm, while increasing their communication with regions conveying visual information. These brain areas thus likely integrate visual and proprioceptive information into a flexible multisensory body representation.