Antje B. M. Gerdes, Matthias J. Wieser, Georg W. Alpers
Clinical and Biological Psychology, Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany.
Department of Psychology, University of Würzburg, Würzburg, Germany.
Front Psychol. 2014 Dec 1;5:1351. doi: 10.3389/fpsyg.2014.01351. eCollection 2014.
In everyday life, multiple sensory channels jointly trigger emotional experiences, and one channel may alter processing in another. For example, seeing an emotional facial expression and hearing the emotional tone of a voice jointly create the emotional experience. This example, in which auditory and visual input relate to social communication, has gained considerable attention from researchers. However, interactions of visual and auditory emotional information are not limited to social communication but extend to much broader contexts, including human, animal, and environmental cues. In this article, we review current research on audiovisual emotion processing beyond face-voice stimuli in order to develop a broader perspective on multimodal interactions in emotion processing. We argue that current concepts of multimodality should be extended by considering an ecologically valid variety of stimuli in audiovisual emotion processing. We therefore provide an overview of studies investigating emotional sounds and their interactions with complex pictures of emotional scenes. In addition to behavioral studies, we focus on neuroimaging as well as electrophysiological and peripheral physiological findings. Furthermore, we integrate these findings and identify their similarities and differences. We conclude with suggestions for future research.