
Shared neural dynamics of facial expression processing.

Author information

Madeline Molly Ely, Géza Gergely Ambrus

Affiliation

Department of Psychology, Bournemouth University, Poole House P319, Talbot Campus, Fern Barrow, Poole, Dorset BH12 5BB, UK.

Publication information

Cogn Neurodyn. 2025 Dec;19(1):45. doi: 10.1007/s11571-025-10230-4. Epub 2025 Mar 4.

Abstract

The ability to recognize and interpret facial expressions is fundamental to human social cognition, enabling navigation of complex interpersonal interactions and understanding of others' emotional states. The extent to which neural patterns associated with facial expression processing are shared between observers remains unexplored, and no study has yet examined the neural dynamics specific to different emotional expressions. Additionally, the neural processing dynamics of facial attributes such as sex and identity in relation to facial expressions have not been thoroughly investigated. In this study, we investigated the shared neural dynamics of emotional face processing using an explicit facial emotion recognition task, where participants made two-alternative forced choice (2AFC) decisions on the displayed emotion. Our data-driven approach employed cross-participant multivariate classification and representational dissimilarity analysis on EEG data. The results demonstrate that EEG signals can effectively decode the sex, emotional expression, and identity of face stimuli across different stimuli and participants, indicating shared neural codes for facial expression processing. Multivariate classification analyses revealed that sex is decoded first, followed by identity, and then emotion. Emotional expressions (angry, happy, sad) were decoded earlier when contrasted with neutral expressions. While identity and sex information were modulated by image-level stimulus features, the effects of emotion were independent of visual image properties. Importantly, our findings suggest enhanced processing of face identity and sex for emotional expressions, particularly for angry faces and, to a lesser extent, happy faces.

SUPPLEMENTARY INFORMATION

The online version contains supplementary material available at 10.1007/s11571-025-10230-4.
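The two analysis techniques named in the abstract, cross-participant multivariate classification and representational dissimilarity analysis, can be illustrated in miniature. The sketch below is not the authors' pipeline; it is a hedged toy example on synthetic data, using a leave-one-participant-out nearest-centroid decoder (one simple form of cross-participant classification) and a correlation-distance RDM. All array sizes, the noise level, and the classifier choice are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic "EEG" patterns: participants x conditions x features,
# where features stand in for flattened channel-by-timepoint responses.
n_participants, n_conditions, n_features = 8, 4, 64

# A condition-specific pattern shared across participants, plus
# participant-specific noise. Shared structure is what makes
# cross-participant decoding possible at all.
shared = rng.normal(size=(n_conditions, n_features))
data = shared[None] + 0.8 * rng.normal(size=(n_participants, n_conditions, n_features))

# Leave-one-participant-out nearest-centroid decoding:
# average the training participants' patterns per condition, then assign
# each held-out pattern to the nearest condition centroid.
correct = 0
for test_p in range(n_participants):
    centroids = np.delete(data, test_p, axis=0).mean(axis=0)  # (conditions, features)
    for cond in range(n_conditions):
        dists = np.linalg.norm(centroids - data[test_p, cond], axis=1)
        correct += int(dists.argmin() == cond)
accuracy = correct / (n_participants * n_conditions)
print(f"cross-participant decoding accuracy: {accuracy:.2f} (chance = 0.25)")

# Representational dissimilarity matrix (1 - Pearson correlation between
# condition patterns), averaged across participants.
def rdm(patterns):
    return 1.0 - np.corrcoef(patterns)

mean_rdm = np.mean([rdm(data[p]) for p in range(n_participants)], axis=0)
print(np.round(mean_rdm, 2))
```

In the paper's setting the decoder would be trained on EEG epochs from all-but-one participant at each timepoint and tested on the held-out participant, which is what licenses the inference about neural codes being shared between observers rather than idiosyncratic.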

