Morningstar Michele
Department of Psychology, Queen's University, Kingston, Canada.
Centre for Neuroscience Studies, Queen's University, Kingston, Canada.
Affect Sci. 2024 Aug 26;5(3):201-208. doi: 10.1007/s42761-024-00265-x. eCollection 2024 Sep.
Affective science has increasingly sought to represent emotional experiences multimodally, measuring affect through a combination of self-report ratings, linguistic output, physiological measures, and/or nonverbal expressions. However, despite widespread recognition that non-facial nonverbal cues are an important facet of expressive behavior, measures of nonverbal expressions commonly focus solely on facial movements. This Commentary represents a call for affective scientists to integrate a larger range of nonverbal cues (including gestures, postures, and vocal cues) alongside facial cues in efforts to represent the experience of emotion and its communication. Using the measurement and analysis of vocal cues as an illustrative case, the Commentary considers challenges, potential solutions, and the theoretical and translational significance of working to integrate multiple nonverbal channels in the study of affect.