Suppr 超能文献



Virtual faces expressing emotions: an initial concomitant and construct validity study.

Affiliations

Department of Psychology, University of Quebec at Trois-Rivières, Trois-Rivières, QC, Canada; Research Center, Philippe-Pinel Institute of Montreal, Montreal, QC, Canada.

Department of Psychology, University of Quebec at Trois-Rivières, Trois-Rivières, QC, Canada.

Publication Info

Front Hum Neurosci. 2014 Sep 30;8:787. doi: 10.3389/fnhum.2014.00787. eCollection 2014.

DOI: 10.3389/fnhum.2014.00787
PMID: 25324768
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC4179743/
Abstract

BACKGROUND

Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities for both fundamental and clinical research. For instance, virtual faces allow real-time human-computer feedback (retroactions) between physiological measures and the virtual agent.

OBJECTIVES

The goal of this study was to initially assess the concomitant and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomatic major and corrugator supercilii muscles), and regional gaze fixation latencies (eyes and mouth regions) were compared in 41 adult volunteers (20 men, 21 women) during the presentation of video clips depicting real vs. virtual adults expressing emotions.

RESULTS

Emotions expressed by each set of stimuli were similarly recognized by both men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times in the eye regions across male and female participants.

CONCLUSION

Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain-Computer Interface studies with feedback-feedforward interactions based on facial emotion expressions can also be conducted with these stimuli.


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d63/4179743/79a158975964/fnhum-08-00787-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d63/4179743/e2e8ccba07e3/fnhum-08-00787-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d63/4179743/9bfcf3c90e4f/fnhum-08-00787-g003.jpg

Similar Articles

1. Virtual faces expressing emotions: an initial concomitant and construct validity study.
Front Hum Neurosci. 2014 Sep 30;8:787. doi: 10.3389/fnhum.2014.00787. eCollection 2014.
2. Rapid Facial Reactions in Response to Facial Expressions of Emotion Displayed by Real Versus Virtual Faces.
Iperception. 2018 Jul 12;9(4):2041669518786527. doi: 10.1177/2041669518786527. eCollection 2018 Jul-Aug.
3. Recognition profile of emotions in natural and virtual faces.
PLoS One. 2008;3(11):e3628. doi: 10.1371/journal.pone.0003628. Epub 2008 Nov 5.
4. An exploratory study of the effect of age and gender on face scanning during affect recognition in immersive virtual reality.
Sci Rep. 2024 Mar 6;14(1):5553. doi: 10.1038/s41598-024-55774-3.
5. Performance-driven facial animation: basic research on human judgments of emotional state in facial avatars.
Cyberpsychol Behav. 2001 Aug;4(4):471-87. doi: 10.1089/109493101750527033.
6. Validation of the Amsterdam Dynamic Facial Expression Set--Bath Intensity Variations (ADFES-BIV): A Set of Videos Expressing Low, Intermediate, and High Intensity Emotions.
PLoS One. 2016 Jan 19;11(1):e0147112. doi: 10.1371/journal.pone.0147112. eCollection 2016.
7. Creation and validation of the Picture-Set of Young Children's Affective Facial Expressions (PSYCAFE).
PLoS One. 2021 Dec 7;16(12):e0260871. doi: 10.1371/journal.pone.0260871. eCollection 2021.
8. Not on the face alone: perception of contextualized face expressions in Huntington's disease.
Brain. 2009 Jun;132(Pt 6):1633-44. doi: 10.1093/brain/awp067. Epub 2009 May 18.
9. Self-relevance appraisal of gaze direction and dynamic facial expressions: effects on facial electromyographic and autonomic reactions.
Emotion. 2013 Apr;13(2):330-7. doi: 10.1037/a0029892. Epub 2012 Sep 17.
10. Younger and Older Users' Recognition of Virtual Agent Facial Expressions.
Int J Hum Comput Stud. 2015 Mar 1;75:1-20. doi: 10.1016/j.ijhcs.2014.11.005.

Cited By

1. Regulation of interpersonal distance in virtual reality: Implications for socio-emotional functioning in late adulthood.
PLoS One. 2025 May 8;20(5):e0323182. doi: 10.1371/journal.pone.0323182. eCollection 2025.
2. An exploratory study of the effect of age and gender on face scanning during affect recognition in immersive virtual reality.
Sci Rep. 2024 Mar 6;14(1):5553. doi: 10.1038/s41598-024-55774-3.
3. Trajectories of Emotion Recognition Training in Virtual Reality and Predictors of Improvement for People with a Psychotic Disorder.
Cyberpsychol Behav Soc Netw. 2023 Apr;26(4):288-299. doi: 10.1089/cyber.2022.0228.
4. Virtual Reality-Assisted Awake Craniotomy: A Retrospective Study.
Cancers (Basel). 2023 Feb 2;15(3):949. doi: 10.3390/cancers15030949.
5. Validation of the Tunisian Test for Facial Emotions Recognition: Study in Children From 7 to 12 Years Old.
Front Psychol. 2021 Nov 22;12:643749. doi: 10.3389/fpsyg.2021.643749. eCollection 2021.
6. Facial Affect Recognition by Patients with Schizophrenia Using Human Avatars.
J Clin Med. 2021 Apr 28;10(9):1904. doi: 10.3390/jcm10091904.
7. Validation of dynamic virtual faces for facial affect recognition.
PLoS One. 2021 Jan 25;16(1):e0246001. doi: 10.1371/journal.pone.0246001. eCollection 2021.
8. Virtual eye region: development of a realistic model to convey emotion.
Heliyon. 2019 Dec 7;5(12):e02778. doi: 10.1016/j.heliyon.2019.e02778. eCollection 2019 Dec.
9. Identification of muscle fatigue by tracking facial expressions.
PLoS One. 2018 Dec 18;13(12):e0208834. doi: 10.1371/journal.pone.0208834. eCollection 2018.
10. Suppression of Sensorimotor Alpha Power Associated With Pain Expressed by an Avatar: A Preliminary EEG Study.
Front Hum Neurosci. 2018 Jul 9;12:273. doi: 10.3389/fnhum.2018.00273. eCollection 2018.

References

1. Prototypicality and intensity of emotional faces using an anchor-point method.
Span J Psychol. 2013;16:E7. doi: 10.1017/sjp.2013.9.
2. The Umeå University Database of Facial Expressions: a validation study.
J Med Internet Res. 2012 Oct 9;14(5):e136. doi: 10.2196/jmir.2196.
3. FACSGen 2.0 animation software: generating three-dimensional FACS-valid facial expressions for emotion research.
Emotion. 2012 Apr;12(2):351-63. doi: 10.1037/a0026632. Epub 2012 Jan 16.
4. Introducing the Geneva Multimodal expression corpus for experimental research on emotion perception.
Emotion. 2012 Oct;12(5):1161-79. doi: 10.1037/a0025827. Epub 2011 Nov 14.
5. Moving faces, looking places: validation of the Amsterdam Dynamic Facial Expression Set (ADFES).
Emotion. 2011 Aug;11(4):907-20. doi: 10.1037/a0023853.
6. Real-time functional magnetic imaging-brain-computer interface and virtual reality: promising tools for the treatment of pedophilia.
Prog Brain Res. 2011;192:263-72. doi: 10.1016/B978-0-444-53355-5.00014-2.
7. Can you feel what you do not see? Using internal feedback to detect briefly presented emotional stimuli.
Int J Psychophysiol. 2012 Jul;85(1):116-24. doi: 10.1016/j.ijpsycho.2011.04.007. Epub 2011 May 13.
8. EMG activity in response to static and dynamic facial expressions.
Int J Psychophysiol. 2011 Feb;79(2):330-3. doi: 10.1016/j.ijpsycho.2010.11.001. Epub 2010 Nov 11.
9. Virtual faces as a tool to study emotion recognition deficits in schizophrenia.
Psychiatry Res. 2010 Oct 30;179(3):247-52. doi: 10.1016/j.psychres.2009.11.004. Epub 2010 May 18.
10. Development of a FACS-verified set of basic and self-conscious emotion expressions.
Emotion. 2009 Aug;9(4):554-9. doi: 10.1037/a0015766.