Cao Linjing, Xu Junhai, Yang Xiaoli, Li Xianglin, Liu Baolin
School of Computer Science and Technology, Tianjin Key Laboratory of Cognitive Computing and Application, Tianjin University, Tianjin, China.
Medical Imaging Research Institute, Binzhou Medical University, Yantai, China.
Front Hum Neurosci. 2018 Oct 18;12:419. doi: 10.3389/fnhum.2018.00419. eCollection 2018.
Emotions can be perceived from the face, the body, and the whole person, yet previous studies on abstract representations of emotions have focused only on faces and bodies. It remains unclear whether specific brain regions represent emotions at an abstract level that generalizes across all three sensory cues. In this study, we used representational similarity analysis (RSA) to test the hypothesis that emotion category is independent of all three stimulus types and can be decoded from the activity patterns elicited by different emotions. Functional magnetic resonance imaging (fMRI) data were collected while participants classified emotions (angry, fearful, and happy) expressed in videos of faces, bodies, and whole persons. An abstract emotion model was defined to estimate the neural representational structure in the whole-brain RSA; it assumed that neural patterns were significantly correlated within emotion conditions, irrespective of stimulus type, but uncorrelated between emotion conditions. The neural representational dissimilarity matrix (RDM) for each voxel was then compared with the abstract emotion model to examine whether specific clusters could identify an abstract representation of emotions that generalized across stimulus types. Significantly positive correlations between neural RDMs and the model indicated that the representational space of specific clusters successfully captured the abstract representation of emotions. The whole-brain RSA revealed an emotion-specific but stimulus-category-independent neural representation in the left postcentral gyrus, left inferior parietal lobe (IPL), and right superior temporal sulcus (STS).
In further cluster-based multivariate pattern analysis (MVPA) with cross-modal classification, only the left postcentral gyrus could successfully distinguish the three emotions for two of the stimulus-type pairs (face-body and body-whole person), and could distinguish happy from angry/fearful (i.e., positive versus negative) for all three stimulus-type pairs. Our study suggests that abstract representations of the three emotions (angry, fearful, and happy) extend from face and body stimuli to whole-person stimuli, and these findings support abstract representations of emotions in the left postcentral gyrus.
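The model-comparison step of the RSA described above can be illustrated with a minimal sketch: a 9-condition model RDM (3 emotions x 3 stimulus types) treats all within-emotion pairs as similar regardless of stimulus type, and is correlated with a neural RDM over the upper-triangle entries. All data and function names here are illustrative toy values, not the authors' actual pipeline or fMRI data.

```python
# Minimal sketch of abstract-emotion RSA (toy data, not the study's pipeline).
from math import sqrt

emotions = ["angry", "fearful", "happy"]
stimuli = ["face", "body", "whole-person"]
conditions = [(e, s) for e in emotions for s in stimuli]  # 9 conditions

# Abstract emotion model: within-emotion pairs have dissimilarity 0
# regardless of stimulus type; between-emotion pairs have dissimilarity 1.
def model_rdm():
    n = len(conditions)
    return [[0.0 if conditions[i][0] == conditions[j][0] else 1.0
             for j in range(n)] for i in range(n)]

def upper_triangle(m):
    n = len(m)
    return [m[i][j] for i in range(n) for j in range(i + 1, n)]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy "neural" RDM: strong emotion structure plus a small stimulus-type
# effect, standing in for distances between voxel activity patterns.
def toy_neural_rdm():
    n = len(conditions)
    rdm = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = 0.8 if conditions[i][0] != conditions[j][0] else 0.2
            d += 0.1 if conditions[i][1] != conditions[j][1] else 0.0
            rdm[i][j] = d
    return rdm

model = model_rdm()
neural = toy_neural_rdm()
r = pearson(upper_triangle(model), upper_triangle(neural))
print(round(r, 3))  # a high positive r means the emotion structure is captured
```

In the study itself, this comparison was made per voxel across the whole brain, and clusters whose neural RDMs correlated significantly with the model were taken as carrying the abstract, stimulus-independent emotion representation.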