Optimal faces for gender and expression: a new technique for measuring dynamic templates used in face perception.

Author Information

Poirier Frédéric J A M, Faubert Jocelyn

Affiliation

Visual Psychophysics and Perception Laboratory, School of Optometry, Montréal, Québec, Canada.

Publication Information

J Vis. 2012 Jun 22;12(6):28. doi: 10.1167/12.6.28.

Abstract

Facial expressions are important for human communication. Face perception studies often measure the impact of major degradations (e.g., noise, inversion, short presentations, masking, alterations) on natural expression recognition performance. Here, we introduce a novel face perception technique using rich and undegraded stimuli. Participants modified faces to create optimal representations of given expressions. Using sliders, participants adjusted 53 face components (including 37 dynamic ones) covering head, eye, eyebrow, mouth, and nose shape and position. Data were collected from six participants across 10 conditions (six emotions + pain + gender + neutral). Some expressions had unique features (e.g., a frown for anger, an upward-curved mouth for happiness), whereas others had shared features (e.g., open eyes and mouth for surprise and fear). Happiness differed from all other emotions. Surprise differed from all other emotions except fear. Weighted-sum morphing provided acceptable gender-neutral and dynamic stimuli. Many features were correlated, including (1) head size with internal feature sizes, as related to gender, (2) internal feature scaling, and (3) eyebrow height with eye openness, as related to surprise and fear. These findings demonstrate the method's validity for measuring optimal facial expressions, which we argue is a more direct measure of their internal representations.
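The weighted-sum morphing mentioned in the abstract can be sketched as a linear blend of two face-parameter vectors. This is a hypothetical illustration, not the authors' code: the 53-component count follows the abstract, but the `morph` function, the parameter values, and the expression labels used here are invented placeholders.

```python
def morph(params_a, params_b, w):
    """Blend two face-parameter vectors as (1 - w) * A + w * B, with w in [0, 1]."""
    if len(params_a) != len(params_b):
        raise ValueError("parameter vectors must have the same length")
    return [(1.0 - w) * a + w * b for a, b in zip(params_a, params_b)]

# Two illustrative 53-component slider settings, e.g. one participant's
# optimal parameters for "surprise" and for "fear" (values are placeholders).
surprise = [0.5] * 53
fear = [0.8] * 53

halfway = morph(surprise, fear, 0.5)   # a 50/50 blend of the two settings
print(len(halfway), halfway[0])        # 53 0.65
```

On this reading, a gender-neutral stimulus would be the w = 0.5 blend of the optimal male and female parameter vectors, and a dynamic stimulus would sweep w over time between a neutral and an expressive setting.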

Similar Articles

1. Cross-emotion facial expression aftereffects.
Vision Res. 2011 Sep 1;51(17):1889-96. doi: 10.1016/j.visres.2011.06.017. Epub 2011 Jul 8.
2. The functional correlates of face perception and recognition of emotional facial expressions as evidenced by fMRI.
Brain Res. 2011 Jun 1;1393:73-83. doi: 10.1016/j.brainres.2011.04.007. Epub 2011 Apr 9.
3. Facial interpretation and component consistency.
Genet Soc Gen Psychol Monogr. 1996 Nov;122(4):389-404.
4. Recognition and discrimination of prototypical dynamic expressions of pain and emotions.
Pain. 2008 Mar;135(1-2):55-64. doi: 10.1016/j.pain.2007.05.008. Epub 2007 Jun 20.
5. The recognition of emotional expression in prosopagnosia: decoding whole and part faces.
J Int Neuropsychol Soc. 2006 Nov;12(6):884-95. doi: 10.1017/S1355617706061066.
6. Face gender and emotion expression: are angry women more like men?
J Vis. 2009 Nov 24;9(12):19.1-8. doi: 10.1167/9.12.19.
7. Recognition profile of emotions in natural and virtual faces.
PLoS One. 2008;3(11):e3628. doi: 10.1371/journal.pone.0003628. Epub 2008 Nov 5.
8. The role of configural information in facial emotion recognition in schizophrenia.
Neuropsychologia. 2006;44(12):2437-44. doi: 10.1016/j.neuropsychologia.2006.04.008. Epub 2006 Jun 23.
