

Multimodal human communication--targeting facial expressions, speech content and prosody.

Affiliation

Department of Psychiatry, Psychotherapy, and Psychosomatics, Medical School, RWTH Aachen University, Aachen, Germany.

Publication Information

Neuroimage. 2012 May 1;60(4):2346-56. doi: 10.1016/j.neuroimage.2012.02.043. Epub 2012 Feb 28.

DOI: 10.1016/j.neuroimage.2012.02.043
PMID: 22487549
Abstract

Human communication is based on a dynamic information exchange of the communication channels facial expressions, prosody, and speech content. This fMRI study elucidated the impact of multimodal emotion processing and the specific contribution of each channel on behavioral empathy and its prerequisites. Ninety-six video clips displaying actors who told self-related stories were presented to 27 healthy participants. In two conditions, all channels uniformly transported only emotional or neutral information. Three conditions selectively presented two emotional channels and one neutral channel. Subjects indicated the actors' emotional valence and their own while fMRI was recorded. Activation patterns of tri-channel emotional communication reflected multimodal processing and facilitative effects for empathy. Accordingly, subjects' behavioral empathy rates significantly deteriorated once one source was neutral. However, emotionality expressed via two of three channels yielded activation in a network associated with theory-of-mind-processes. This suggested participants' effort to infer mental states of their counterparts and was accompanied by a decline of behavioral empathy, driven by the participants' emotional responses. Channel-specific emotional contributions were present in modality-specific areas. The identification of different network-nodes associated with human interactions constitutes a prerequisite for understanding dynamics that underlie multimodal integration and explain the observed decline in empathy rates. This task might also shed light on behavioral deficits and neural changes that accompany psychiatric diseases.


Similar Articles

1
Multimodal human communication--targeting facial expressions, speech content and prosody.
Neuroimage. 2012 May 1;60(4):2346-56. doi: 10.1016/j.neuroimage.2012.02.043. Epub 2012 Feb 28.
2
The differential contribution of facial expressions, prosody, and speech content to empathy.
Cogn Emot. 2012;26(6):995-1014. doi: 10.1080/02699931.2011.631296. Epub 2012 Jan 3.
3
Emotions in motion: dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations.
Brain Res. 2009 Aug 11;1284:100-15. doi: 10.1016/j.brainres.2009.05.075. Epub 2009 Jun 6.
4
Integration of cross-modal emotional information in the human brain: an fMRI study.
Cortex. 2010 Feb;46(2):161-9. doi: 10.1016/j.cortex.2008.06.008. Epub 2008 Jun 29.
5
Emotional and cognitive aspects of empathy and their relation to social cognition--an fMRI-study.
Brain Res. 2010 Jan 22;1311:110-20. doi: 10.1016/j.brainres.2009.11.043. Epub 2009 Nov 26.
6
The functional correlates of face perception and recognition of emotional facial expressions as evidenced by fMRI.
Brain Res. 2011 Jun 1;1393:73-83. doi: 10.1016/j.brainres.2011.04.007. Epub 2011 Apr 9.
7
FMRI study of emotional speech comprehension.
Cereb Cortex. 2007 Feb;17(2):339-52. doi: 10.1093/cercor/bhj151. Epub 2006 Mar 8.
8
Mirror neuron and theory of mind mechanisms involved in face-to-face interactions: a functional magnetic resonance imaging approach to empathy.
J Cogn Neurosci. 2007 Aug;19(8):1354-72. doi: 10.1162/jocn.2007.19.8.1354.
9
On emotional conflict: interference resolution of happy and angry prosody reveals valence-specific effects.
Cereb Cortex. 2010 Feb;20(2):383-92. doi: 10.1093/cercor/bhp106. Epub 2009 Jun 8.
10
The neural mechanism of imagining facial affective expression.
Brain Res. 2007 May 11;1145:128-37. doi: 10.1016/j.brainres.2006.12.048. Epub 2006 Dec 22.

Cited By

1
The impact of visual information in speech perception for individuals with hearing loss: a mini review.
Front Psychol. 2024 Sep 24;15:1399084. doi: 10.3389/fpsyg.2024.1399084. eCollection 2024.
2
TMS disruption of the lateral prefrontal cortex increases neural activity in the default mode network when naming facial expressions.
Soc Cogn Affect Neurosci. 2023 Nov 30;18(1). doi: 10.1093/scan/nsad072.
3
Somatic engagement alters subsequent neurobehavioral correlates of affective mentalizing.
Hum Brain Mapp. 2021 Dec 15;42(18):5846-5861. doi: 10.1002/hbm.25640. Epub 2021 Oct 14.
4
Robust Multimodal Emotion Recognition from Conversation with Transformer-Based Crossmodality Fusion.
Sensors (Basel). 2021 Jul 19;21(14):4913. doi: 10.3390/s21144913.
5
Subjective emotional arousal: an explorative study on the role of gender, age, intensity, emotion regulation difficulties, depression and anxiety symptoms, and meta-emotion.
Psychol Res. 2020 Oct;84(7):1857-1876. doi: 10.1007/s00426-019-01197-z. Epub 2019 May 16.
6
The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English.
PLoS One. 2018 May 16;13(5):e0196391. doi: 10.1371/journal.pone.0196391. eCollection 2018.
7
Deficits in Response Inhibition in Patients with Attention-Deficit/Hyperactivity Disorder: The Impaired Self-Protection System Hypothesis.
Front Psychiatry. 2018 Jan 22;8:299. doi: 10.3389/fpsyt.2017.00299. eCollection 2017.
8
Neural measures of the role of affective prosody in empathy for pain.
Sci Rep. 2018 Jan 10;8(1):291. doi: 10.1038/s41598-017-18552-y.
9
Exploring the Neural Basis of Avatar Identification in Pathological Internet Gamers and of Self-Reflection in Pathological Social Network Users.
J Behav Addict. 2016 Sep;5(3):485-99. doi: 10.1556/2006.5.2016.048. Epub 2016 Jul 14.
10
Doing Duo - a case study of entrainment in William Forsythe's choreography "Duo".
Front Hum Neurosci. 2014 Oct 21;8:812. doi: 10.3389/fnhum.2014.00812. eCollection 2014.