Tomoyasu Horikawa, Alan S. Cowen, Dacher Keltner, Yukiyasu Kamitani
Department of Neuroinformatics, ATR Computational Neuroscience Laboratories, Hikaridai, Seika, Soraku, Kyoto, 619-0288, Japan.
Department of Psychology, University of California, Berkeley, CA 94720-1500, USA.
iScience. 2020 May 22;23(5):101060. doi: 10.1016/j.isci.2020.101060. Epub 2020 Apr 17.
Central to our subjective lives is the experience of different emotions. Recent behavioral work mapping emotional responses to 2,185 videos found that people experience upward of 27 distinct emotions occupying a high-dimensional space, and that emotion categories, more so than affective dimensions (e.g., valence), organize self-reports of subjective experience. Here, we sought to identify the neural substrates of this high-dimensional space of emotional experience using fMRI responses to all 2,185 videos. Our analyses demonstrated that (1) dozens of video-evoked emotions were accurately predicted from fMRI patterns in multiple brain regions with different regional configurations for individual emotions; (2) emotion categories better predicted cortical and subcortical responses than affective dimensions, outperforming visual and semantic covariates in transmodal regions; and (3) emotion-related fMRI responses had a cluster-like organization efficiently characterized by distinct categories. These results support an emerging theory of the high-dimensional emotion space, illuminating its neural foundations distributed across transmodal regions.
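The core analysis claimed in point (1), predicting per-video emotion-category scores from fMRI response patterns, is typically done with regularized linear decoding. The sketch below is not the authors' code; it is a minimal illustration using ridge regression on synthetic data, with placeholder dimensions echoing the study (2,185 videos, 27 emotion categories) and an assumed voxel count.

```python
# Illustrative sketch (not the authors' pipeline): decoding emotion-category
# scores from fMRI voxel patterns with ridge regression. All data are
# synthetic; 500 voxels and the noise level are arbitrary assumptions.
import numpy as np
from numpy.linalg import solve

rng = np.random.default_rng(0)
n_videos, n_voxels, n_categories = 2185, 500, 27

# Synthetic ground truth: category scores linearly expressed in voxel
# responses, plus Gaussian noise.
W_true = rng.normal(size=(n_categories, n_voxels))
scores = rng.normal(size=(n_videos, n_categories))   # per-video ratings
fmri = scores @ W_true + rng.normal(scale=5.0, size=(n_videos, n_voxels))

# Hold out the last videos for testing.
split = 1800
X_tr, X_te = fmri[:split], fmri[split:]
Y_tr, Y_te = scores[:split], scores[split:]

# Ridge regression in closed form: W = (X'X + alpha*I)^-1 X'Y.
alpha = 10.0
W = solve(X_tr.T @ X_tr + alpha * np.eye(n_voxels), X_tr.T @ Y_tr)
Y_pred = X_te @ W

# Decoding accuracy per category: correlation between predicted and
# true scores across held-out videos.
r = np.array([np.corrcoef(Y_te[:, j], Y_pred[:, j])[0, 1]
              for j in range(n_categories)])
print(f"mean decoding correlation across {n_categories} categories: "
      f"{r.mean():.2f}")
```

In practice the regularization strength would be selected by cross-validation within the training set, and separate decoders would be fit per brain region to compare regional configurations, as the abstract describes.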