Zinchenko Oksana, Yaple Zachary A, Arsalidou Marie
Centre for Cognition and Decision Making, Institute for Cognitive Neuroscience, National Research University Higher School of Economics, Moscow, Russia.
Department of Psychology, National University of Singapore, Singapore, Singapore.
Front Hum Neurosci. 2018 Jun 5;12:227. doi: 10.3389/fnhum.2018.00227. eCollection 2018.
Identifying facial expressions is crucial for social interactions. Functional neuroimaging studies show that brain areas such as the fusiform gyrus and amygdala become active when viewing emotional facial expressions. The majority of functional magnetic resonance imaging (fMRI) studies investigating face perception employ static images of faces. However, studies that use dynamic facial expressions (e.g., videos) are accumulating and suggest that dynamic presentation may be more sensitive and ecologically valid for investigating face perception. Using quantitative fMRI meta-analysis, the present study examined concordance among brain regions associated with viewing dynamic facial expressions. We analyzed data from 216 participants across 14 studies, which together reported coordinates for 28 experiments. Our analysis revealed concordant activation in the bilateral fusiform and middle temporal gyri, the left amygdala, the left declive of the cerebellum, and the right inferior frontal gyrus. These regions are discussed in terms of their relation to models of face processing.
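The abstract describes the method only as a quantitative fMRI meta-analysis; coordinate-based approaches such as activation likelihood estimation (ALE) are the standard way to assess concordance across reported peak coordinates. The sketch below is a minimal illustration of that idea in Python with NumPy/SciPy, assuming a shared voxel grid, a fixed Gaussian kernel width, and entirely hypothetical foci; it is not the software pipeline or the coordinate data used in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Illustrative ALE-style concordance sketch.
# Grid size, kernel width, and foci are hypothetical placeholders,
# not values from the 14 studies analyzed in the paper.

GRID = (40, 48, 40)        # small illustrative voxel grid (x, y, z)
SIGMA_VOX = 3.0            # assumed Gaussian kernel width in voxels

# One list of peak voxel coordinates per experiment (hypothetical values).
experiments = [
    [(12, 20, 15), (30, 22, 14)],
    [(13, 21, 15)],
    [(11, 19, 16), (25, 40, 20)],
]

def modeled_activation(foci, grid=GRID, sigma=SIGMA_VOX):
    """Smooth each focus with a Gaussian and take the voxel-wise maximum,
    giving one modeled-activation (MA) map per experiment."""
    ma = np.zeros(grid)
    for focus in foci:
        impulse = np.zeros(grid)
        impulse[focus] = 1.0
        blob = gaussian_filter(impulse, sigma)
        blob /= blob.max()            # scale peak to 1 for readability
        ma = np.maximum(ma, blob)
    return ma

# Combine experiments as a probabilistic union: ALE = 1 - prod(1 - MA_i),
# so voxels where several experiments report nearby foci score highest.
ma_maps = np.stack([modeled_activation(f) for f in experiments])
ale = 1.0 - np.prod(1.0 - ma_maps, axis=0)

peak = np.unravel_index(np.argmax(ale), ale.shape)
print(f"Peak concordance {ale.max():.3f} at voxel {peak}")
```

In practice, dedicated tools (e.g., GingerALE or NiMARE) add empirically derived kernels and permutation-based significance testing on top of this union step; the sketch only conveys why spatially overlapping foci across experiments yield high concordance values.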