
Neural coding of cooperative vs. affective human interactions: 150 ms to code the action's purpose.

Affiliations

Department of Psychology, University of Milano-Bicocca, Milan, Italy.

Publication

PLoS One. 2011;6(7):e22026. doi: 10.1371/journal.pone.0022026. Epub 2011 Jul 7.

Abstract

The timing and neural processing underlying the understanding of social interactions were investigated by presenting scenes in which 2 people performed cooperative or affective actions. While the role of the human mirror neuron system (MNS) in understanding actions and intentions is widely accepted, little is known about the time course within which these aspects of visual information are automatically extracted. Event-related potentials were recorded in 35 university students perceiving 260 pictures of cooperative (e.g., 2 people dragging a box) or affective (e.g., 2 people smiling and holding hands) interactions. The action's goal was automatically discriminated at about 150-170 ms, as reflected by the occipito/temporal N170 response. The swLORETA inverse solution revealed the strongest sources in the right posterior cingulate cortex (CC) for affective actions and in the right pSTS for cooperative actions. A right hemispheric asymmetry involving the fusiform gyrus (BA37), the posterior CC, and the medial frontal gyrus (BA10/11) was found for the processing of affective interactions, particularly in the 155-175 ms time window. In a later time window (200-250 ms), the processing of cooperative interactions activated the left post-central gyrus (BA3), the left parahippocampal gyrus, the left superior frontal gyrus (BA10), as well as the right premotor cortex (BA6). Women showed a greater response discriminative of the action's goal than men at the level of the P300 and anterior negativity (220-500 ms). These findings might be related to a greater responsiveness of the female vs. male MNS. In addition, the discriminative effect was bilateral in women, whereas it was smaller and left-sided in men. Evidence was provided that perceptually similar social interactions are discriminated on the basis of the agents' intentions quite early in neural processing, differentially activating regions devoted to face/body/action coding, the limbic system, and the MNS.


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7157/3131384/e532c68f0e5b/pone.0022026.g001.jpg
