


The cross-race effect in automatic facial expression recognition violates measurement invariance.

Authors

Li Yen-Ting, Yeh Su-Ling, Huang Tsung-Ren

Affiliations

Department of Psychology, National Taiwan University, Taipei City, Taiwan.

Graduate Institute of Brain and Mind Sciences, National Taiwan University, Taipei City, Taiwan.

Publication

Front Psychol. 2023 Dec 7;14:1201145. doi: 10.3389/fpsyg.2023.1201145. eCollection 2023.

DOI: 10.3389/fpsyg.2023.1201145
PMID: 38130968
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10733503/
Abstract

Emotion has been a subject undergoing intensive research in psychology and cognitive neuroscience over several decades. Recently, more and more studies of emotion have adopted automatic rather than manual methods of facial emotion recognition to analyze images or videos of human faces. Compared to manual methods, these computer-vision-based, automatic methods can help objectively and rapidly analyze a large amount of data. These automatic methods have also been validated and believed to be accurate in their judgments. However, these automatic methods often rely on statistical learning models (e.g., deep neural networks), which are intrinsically inductive and thus suffer from problems of induction. Specifically, models that were trained primarily on Western faces may not generalize well enough to accurately judge Eastern faces, which can then jeopardize the measurement invariance of emotions in cross-cultural studies. To demonstrate such a possibility, the present study carries out a cross-racial validation of two popular facial emotion recognition systems, FaceReader and DeepFace, using two Western and two Eastern face datasets. Although both systems could achieve overall high accuracies in the judgments of emotion category on the Western datasets, they performed relatively poorly on the Eastern datasets, especially in recognition of negative emotions. While these results caution against using these automatic methods of emotion recognition on non-Western faces, they also suggest that the measurements of happiness output by these automatic methods are accurate and invariant across races and hence can still be utilized for cross-cultural studies of positive psychology.
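The cross-racial validation described above boils down to comparing per-emotion recognition accuracy across datasets: if a classifier's accuracy for the same emotion category differs sharply between Western and Eastern faces, measurement invariance is in doubt. A minimal sketch of that comparison is below; the labels are hypothetical toy data, not the paper's datasets, and the function is an illustration rather than the authors' analysis code.

```python
from collections import defaultdict

def accuracy_by_category(true_labels, predicted_labels):
    """Per-emotion accuracy: the fraction of faces with a given true
    label that the classifier judged correctly."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p in zip(true_labels, predicted_labels):
        total[t] += 1
        if t == p:
            correct[t] += 1
    return {emo: correct[emo] / total[emo] for emo in total}

# Hypothetical toy labels standing in for one Western and one Eastern
# dataset (true label vs. classifier prediction for each face).
western_true = ["happy", "happy", "sad", "angry", "fear"]
western_pred = ["happy", "happy", "sad", "angry", "sad"]
eastern_true = ["happy", "happy", "sad", "angry", "fear"]
eastern_pred = ["happy", "happy", "angry", "fear", "sad"]

west = accuracy_by_category(western_true, western_pred)
east = accuracy_by_category(eastern_true, eastern_pred)

# A large accuracy gap on the same category across datasets is the kind
# of evidence the paper reads as a violation of measurement invariance;
# in this toy data, happiness stays accurate while negative emotions
# degrade on the Eastern set, mirroring the abstract's pattern.
for emo in sorted(west):
    print(f"{emo}: Western {west[emo]:.2f} vs Eastern {east.get(emo, 0.0):.2f}")
```

In the paper itself, the predictions would come from FaceReader or DeepFace run on the four face datasets; the comparison logic is the same.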


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f5f/10733503/cf2113f67cb1/fpsyg-14-1201145-g001.jpg
Figure 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4f5f/10733503/c82de80b1274/fpsyg-14-1201145-g002.jpg

Similar Articles

1
The cross-race effect in automatic facial expression recognition violates measurement invariance.
Front Psychol. 2023 Dec 7;14:1201145. doi: 10.3389/fpsyg.2023.1201145. eCollection 2023.
2
Children's recognition of emotion expressed by own-race versus other-race faces.
J Exp Child Psychol. 2019 Jun;182:102-113. doi: 10.1016/j.jecp.2019.01.009. Epub 2019 Feb 25.
3
The emotion-facial expression link: evidence from human and automatic expression recognition.
Psychol Res. 2021 Nov;85(8):2954-2969. doi: 10.1007/s00426-020-01448-4. Epub 2020 Nov 24.
4
Automatic Facial Expression Recognition in Standardized and Non-standardized Emotional Expressions.
Front Psychol. 2021 May 5;12:627561. doi: 10.3389/fpsyg.2021.627561. eCollection 2021.
5
The Influence of Anxiety on the Recognition of Facial Emotion Depends on the Emotion Category and Race of the Target Faces.
Exp Neurobiol. 2019 Apr;28(2):261-269. doi: 10.5607/en.2019.28.2.261. Epub 2019 Apr 30.
6
An automatic improved facial expression recognition for masked faces.
Neural Comput Appl. 2023;35(20):14963-14972. doi: 10.1007/s00521-023-08498-w. Epub 2023 Apr 1.
7
Differential emotion attribution to neutral faces of own and other races.
Cogn Emot. 2017 Feb;31(2):360-368. doi: 10.1080/02699931.2015.1092419. Epub 2015 Oct 14.
8
Emotion recognition from faces with in- and out-group features in patients with depression.
J Affect Disord. 2018 Feb;227:817-823. doi: 10.1016/j.jad.2017.11.085. Epub 2017 Nov 21.
9
Is facial emotion recognition impairment in schizophrenia identical for different emotions? A signal detection analysis.
Schizophr Res. 2008 Feb;99(1-3):263-9. doi: 10.1016/j.schres.2007.11.006. Epub 2008 Jan 3.
10
Impaired Recognition of Facial and Vocal Emotions in Mild Cognitive Impairment.
J Int Neuropsychol Soc. 2022 Jan;28(1):48-61. doi: 10.1017/S135561772100014X. Epub 2021 Mar 4.

Cited By

1
The nonverbal expression of guilt in healthy adults.
Sci Rep. 2024 May 8;14(1):10607. doi: 10.1038/s41598-024-60980-0.

References

1
The effect of observation angles on facial age perceptions: A case study of Japanese women.
PLoS One. 2022 Dec 27;17(12):e0279339. doi: 10.1371/journal.pone.0279339. eCollection 2022.
2
The future of human behaviour research.
Nat Hum Behav. 2022 Jan;6(1):15-24. doi: 10.1038/s41562-021-01275-6.
3
Cross-Domain Facial Expression Recognition: A Unified Evaluation Benchmark and Adversarial Graph Learning.
IEEE Trans Pattern Anal Mach Intell. 2022 Dec;44(12):9887-9903. doi: 10.1109/TPAMI.2021.3131222. Epub 2022 Nov 7.
4
Do emotions result in their predicted facial expressions? A meta-analysis of studies on the co-occurrence of expression and emotion.
Emotion. 2021 Oct;21(7):1550-1569. doi: 10.1037/emo0001015. Epub 2021 Nov 15.
5
Do my emotions show or not? Problems with transparency estimation in women with borderline personality disorder features.
Personal Disord. 2022 May;13(3):288-299. doi: 10.1037/per0000504. Epub 2021 Oct 21.
6
The emotion-facial expression link: evidence from human and automatic expression recognition.
Psychol Res. 2021 Nov;85(8):2954-2969. doi: 10.1007/s00426-020-01448-4. Epub 2020 Nov 24.
7
A performance comparison of eight commercially available automatic classifiers for facial affect recognition.
PLoS One. 2020 Apr 24;15(4):e0231968. doi: 10.1371/journal.pone.0231968. eCollection 2020.
8
Assessing the Effectiveness of Automated Emotion Recognition in Adults and Children for Clinical Investigation.
Front Hum Neurosci. 2020 Apr 7;14:70. doi: 10.3389/fnhum.2020.00070. eCollection 2020.
9
Assessing the convergent validity between the automated emotion recognition software Noldus FaceReader 7 and Facial Action Coding System Scoring.
PLoS One. 2019 Oct 17;14(10):e0223905. doi: 10.1371/journal.pone.0223905. eCollection 2019.
10
Facial Affect and Interpersonal Affiliation: Displays of Emotion During Relationship Formation in Social Anxiety Disorder.
Clin Psychol Sci. 2019 Jul;7(4):826-839. doi: 10.1177/2167702619825857. Epub 2019 Mar 12.