Emotional Intensity Modulates the Integration of Bimodal Angry Expressions: ERP Evidence.

Author Information

Pan Zhihui, Liu Xi, Luo Yangmei, Chen Xuhai

Affiliations

Key Laboratory of Behavior and Cognitive Psychology in Shaanxi Province, School of Psychology, Shaanxi Normal University, Xi'an, China.

State Key Laboratory of Cognitive Neuroscience and Learning and IDG/McGovern Institute for Brain Research, School of Brain and Cognitive Sciences, Beijing Normal University, Beijing, China.

Publication Information

Front Neurosci. 2017 Jun 21;11:349. doi: 10.3389/fnins.2017.00349. eCollection 2017.

Abstract

Integration of information from face and voice plays a central role in social interactions. The present study investigated how emotional intensity modulates the integration of facial-vocal emotional cues by recording EEG while participants performed an emotion identification task on facial, vocal, and bimodal angry expressions varying in emotional intensity. Behavioral results showed that anger identification rates and reaction speed increased with emotional intensity across modalities. Critically, P2 amplitudes were larger for bimodal expressions than for the sum of facial and vocal expressions for low emotional intensity stimuli, but not for middle and high emotional intensity stimuli. These findings suggest that emotional intensity modulates the integration of facial-vocal angry expressions, following the principle of Inverse Effectiveness (IE) in multimodal sensory integration.
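
The critical comparison here is the standard additive-model test for multisensory integration: the response to the bimodal stimulus (AV) is contrasted with the algebraic sum of the two unimodal responses (A + V), separately at each intensity level. Below is a minimal sketch of that comparison on simulated per-participant P2 amplitudes; all variable names, effect sizes, and generated data are hypothetical illustrations, not the study's actual data or analysis code.

```python
# Hypothetical sketch of the additive-model test described in the abstract:
# compare bimodal (face + voice) P2 amplitudes against the sum of the
# unimodal amplitudes, separately for each emotional-intensity level.
# All data below are simulated and illustrative only.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_participants = 20
intensities = ["low", "middle", "high"]

for level in intensities:
    # Simulated mean P2 amplitudes (microvolts) per participant.
    face = rng.normal(2.0, 0.5, n_participants)
    voice = rng.normal(1.5, 0.5, n_participants)
    # Build supra-additivity into the low-intensity condition only,
    # mimicking the inverse-effectiveness pattern the study reports.
    boost = 1.0 if level == "low" else 0.0
    bimodal = face + voice + boost + rng.normal(0.0, 0.5, n_participants)

    # Additive model: is AV reliably larger than A + V?
    t, p = ttest_rel(bimodal, face + voice)
    print(f"{level:>6}: AV - (A+V) = {np.mean(bimodal - (face + voice)):+.2f} uV, "
          f"t({n_participants - 1}) = {t:.2f}, p = {p:.3f}")
```

Under Inverse Effectiveness, the AV > A + V advantage should be strongest when the unimodal signals are weakest, which matches the low-intensity-only supra-additivity the abstract reports.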

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8b45/5478688/f8e4484a49b3/fnins-11-00349-g0001.jpg
