

Read My Face: Automatic Facial Coding Versus Psychophysiological Indicators of Emotional Valence and Arousal.

Author Information

Höfling T Tim A, Gerdes Antje B M, Föhl Ulrich, Alpers Georg W

Affiliations

Department of Psychology, School of Social Sciences, University of Mannheim, Mannheim, Germany.

Business Unit, Pforzheim University of Applied Sciences, Pforzheim, Germany.

Publication Information

Front Psychol. 2020 Jun 19;11:1388. doi: 10.3389/fpsyg.2020.01388. eCollection 2020.

Abstract

Facial expressions provide insight into a person's emotional experience. Automatically decoding these expressions has become possible thanks to tremendous progress in the field of computer vision. Researchers are now able to decode emotional facial expressions with impressive accuracy in standardized images of prototypical basic emotions. We tested the sensitivity of a well-established automatic facial coding software program in detecting spontaneous emotional reactions of individuals responding to emotional pictures. We compared automatically generated valence and arousal scores from FaceReader (FR; Noldus Information Technology) with the current psychophysiological gold standards for measuring emotional valence (facial electromyography, EMG) and arousal (skin conductance, SC). We recorded physiological and behavioral measurements from 43 healthy participants while they looked at pleasant, unpleasant, or neutral scenes. For pleasant pictures, FR Valence and EMG were comparably sensitive. For unpleasant pictures, FR Valence showed the expected negative shift, but the signal did not differentiate well between responses to neutral and unpleasant stimuli, which were distinguishable with EMG. Furthermore, FR Arousal values correlated more strongly with self-reported valence than with self-reported arousal, whereas SC was sensitive and specifically associated with self-reported arousal. This is the first study to systematically compare FR measurement of spontaneous emotional reactions to standardized emotional images with established psychophysiological measurement tools. This novel technology has yet to surpass the sensitivity of established psychophysiological measures. However, it provides a promising new technique for non-contact assessment of emotional responses.
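For readers who want to run a comparable analysis on their own data, the sketch below illustrates the general logic described in the abstract: averaging each measure per picture category to check sensitivity, and correlating the automatic arousal score with self-reported valence and arousal to check specificity. This is a minimal illustration, not the authors' actual pipeline; the file name and all column names (condition, fr_valence, fr_arousal, emg_corrugator, emg_zygomaticus, scl, self_valence, self_arousal) are hypothetical placeholders.

```python
# Illustrative sketch (not the authors' pipeline): compare per-condition means of
# FaceReader valence, facial EMG, and skin conductance, and correlate the
# automatic arousal score with self-reported valence and arousal.
import pandas as pd
from scipy.stats import pearsonr

# Expected long-format data: one row per participant x picture condition
# (pleasant / neutral / unpleasant). Hypothetical file name.
df = pd.read_csv("emotion_responses.csv")

# Sensitivity check: mean response per picture category for each measure.
measures = ["fr_valence", "emg_corrugator", "emg_zygomaticus", "fr_arousal", "scl"]
condition_means = df.groupby("condition")[measures].mean()
print(condition_means)

# Specificity check: does the automatic arousal score track self-reported arousal,
# or does it instead track self-reported valence (as the abstract reports)?
for target in ["self_arousal", "self_valence"]:
    r, p = pearsonr(df["fr_arousal"], df[target])
    print(f"FR arousal vs. {target}: r = {r:.2f}, p = {p:.3f}")

# Same check for skin conductance, which should align with self-reported arousal.
r, p = pearsonr(df["scl"], df["self_arousal"])
print(f"SC vs. self_arousal: r = {r:.2f}, p = {p:.3f}")
```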


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/500a/7316962/af9a74d0b38a/fpsyg-11-01388-g001.jpg
