Center for Mind/Brain Sciences, University of Trento, Rovereto (TN), Italy.
J Neurosci. 2010 Jul 28;30(30):10127-34. doi: 10.1523/JNEUROSCI.2161-10.2010.
Basic emotional states (such as anger, fear, and joy) can be similarly conveyed by the face, the body, and the voice. Are there human brain regions that represent these emotional mental states regardless of the sensory cues from which they are perceived? To address this question, in the present study participants evaluated the intensity of emotions perceived from face movements, body movements, or vocal intonations, while their brain activity was measured with functional magnetic resonance imaging (fMRI). Using multivoxel pattern analysis, we compared the similarity of response patterns across modalities to test for brain regions in which emotion-specific patterns in one modality (e.g., faces) could predict emotion-specific patterns in another modality (e.g., bodies). A whole-brain searchlight analysis revealed modality-independent but emotion category-specific activity patterns in medial prefrontal cortex (MPFC) and left superior temporal sulcus (STS). Multivoxel patterns in these regions contained information about the category of the perceived emotions (anger, disgust, fear, happiness, sadness) across all modality comparisons (face-body, face-voice, body-voice), and independently of the perceived intensity of the emotions. No systematic emotion-related differences were observed in the overall amplitude of activation in MPFC or STS. These results reveal supramodal representations of emotions in high-level brain areas previously implicated in affective processing, mental state attribution, and theory-of-mind. We suggest that MPFC and STS represent perceived emotions at an abstract, modality-independent level, and thus play a key role in the understanding and categorization of others' emotional mental states.
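The cross-modal pattern-similarity logic described above can be sketched as follows. This is a minimal illustrative simulation, not the authors' actual analysis pipeline: the data, voxel counts, and noise model are invented, and the real study used searchlight MVPA on fMRI data. The core idea is that a region carries modality-independent emotion information when within-emotion pattern correlations across modalities (e.g., anger-from-faces vs. anger-from-bodies) exceed between-emotion correlations.

```python
# Hypothetical sketch of cross-modal pattern-similarity analysis.
# All data here are simulated; this only illustrates the comparison logic.
import numpy as np

rng = np.random.default_rng(0)
emotions = ["anger", "disgust", "fear", "happiness", "sadness"]
n_voxels = 100  # illustrative searchlight size

# Simulate a shared emotion-specific pattern plus modality-specific noise,
# the scenario in which a region represents emotions supramodally.
base = {e: rng.normal(size=n_voxels) for e in emotions}
face = {e: base[e] + 0.5 * rng.normal(size=n_voxels) for e in emotions}
body = {e: base[e] + 0.5 * rng.normal(size=n_voxels) for e in emotions}

def corr(a, b):
    """Pearson correlation between two voxel patterns."""
    return float(np.corrcoef(a, b)[0, 1])

# Within-emotion similarity: same emotion, different modality.
within = np.mean([corr(face[e], body[e]) for e in emotions])

# Between-emotion similarity: different emotions, different modality.
between = np.mean([corr(face[e1], body[e2])
                   for e1 in emotions for e2 in emotions if e1 != e2])

# Evidence for modality-independent, emotion-specific coding:
print(within > between)
```

In the study, this within- versus between-emotion comparison was run in a whole-brain searchlight across all three modality pairs (face-body, face-voice, body-voice), with MPFC and left STS showing the effect.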