LabNeuro - Laboratory of Cognitive Neurophysiology, Department of Physiology, Institute of Biological Sciences, Federal University of Juiz de Fora (UFJF), Juiz de Fora, Minas Gerais, Brazil.
IIT@UniFe Center for Translational Neurophysiology of Speech and Communication, Istituto Italiano di Tecnologia, Via Fossato di Mortara, 17-19, Ferrara, Italy.
Sci Rep. 2019 Aug 23;9(1):12328. doi: 10.1038/s41598-019-48758-1.
There is a current claim that humans can effortlessly detect others' hidden mental states simply by observing their movements and transforming the visual input into motor knowledge to predict behaviour. Using a classical paradigm for quantifying motor predictions, we tested the role of visual feedback during a reach and load-lifting task performed either alone or with the help of a partner. Wrist flexor and extensor muscle activities were recorded from the supporting hand. Early muscle changes preventing limb instability when participants performed the task by themselves revealed the contribution of visual input to postural anticipation. When the partner performed the unloading, a condition mimicking a split-brain situation, motor prediction followed a pattern that evolved over the course of the task and changed with the integration of successive somatosensory feedback. Our findings demonstrate that during social behaviour, in addition to relying on self-motor representations, individuals cooperate by continuously integrating sensory signals from multiple sources.