


Recognising facial expression from spatially and temporally modified movements.

Author Information

Pollick Frank E, Hill Harold, Calder Andrew, Paterson Helena

Affiliation

Department of Psychology, University of Glasgow, 58 Hillhead Street, Glasgow G12 8QB, Scotland, UK.

Publication Information

Perception. 2003;32(7):813-26. doi: 10.1068/p3319.

DOI: 10.1068/p3319
PMID: 12974567
Abstract

We examined how the recognition of facial emotion was influenced by manipulation of both spatial and temporal properties of 3-D point-light displays of facial motion. We started with the measurement of 3-D position of multiple locations on the face during posed expressions of anger, happiness, sadness, and surprise, and then manipulated the spatial and temporal properties of the measurements to obtain new versions of the movements. In two experiments, we examined recognition of these original and modified facial expressions: in experiment 1, we manipulated the spatial properties of the facial movement, and in experiment 2 we manipulated the temporal properties. The results of experiment 1 showed that exaggeration of facial expressions relative to a fixed neutral expression resulted in enhanced ratings of the intensity of that emotion. The results of experiment 2 showed that changing the duration of an expression had a small effect on ratings of emotional intensity, with a trend for expressions with shorter durations to have lower ratings of intensity. The results are discussed within the context of theories of encoding as related to caricature and emotion.

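The spatial manipulation described in the abstract is a form of motion caricaturing: each tracked 3-D face point is displaced away from its position in a fixed neutral expression by a gain factor. A minimal sketch of that idea, with illustrative function names and made-up coordinates (the paper's actual stimulus pipeline is not specified here):

```python
# Hypothetical sketch of spatial exaggeration relative to a neutral face.
# A gain > 1 exaggerates the expression, gain < 1 attenuates it, and
# gain = 1 reproduces the original movement. All names/data are illustrative.

def exaggerate_frame(expression_points, neutral_points, gain):
    """Scale each 3-D point's offset from the neutral face by `gain`."""
    return [
        tuple(n + gain * (e - n) for e, n in zip(expr_pt, neut_pt))
        for expr_pt, neut_pt in zip(expression_points, neutral_points)
    ]

# One frame with two tracked (x, y, z) points, plus the neutral face.
neutral = [(0.0, 0.0, 0.0), (1.0, 1.0, 0.0)]
frame = [(0.2, 0.0, 0.1), (1.0, 1.4, 0.0)]

print(exaggerate_frame(frame, neutral, gain=1.5))
```

Applying this per frame of a recorded sequence exaggerates the movement spatially while leaving its timing untouched; the temporal manipulation in experiment 2 is the complementary operation of resampling frames to change duration.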

Similar Articles

1
Recognising facial expression from spatially and temporally modified movements.
Perception. 2003;32(7):813-26. doi: 10.1068/p3319.
2
Caricaturing facial expressions.
Cognition. 2000 Aug 14;76(2):105-46. doi: 10.1016/s0010-0277(00)00074-3.
3
Emotion perception from dynamic and static body expressions in point-light and full-light displays.
Perception. 2004;33(6):717-46. doi: 10.1068/p5096.
4
Mapping correspondence between facial mimicry and emotion recognition in healthy subjects.
Emotion. 2012 Dec;12(6):1398-403. doi: 10.1037/a0028588. Epub 2012 May 28.
5
Computer-enhanced emotion in facial expressions.
Proc Biol Sci. 1997 Jun 22;264(1383):919-25. doi: 10.1098/rspb.1997.0127.
6
Enhancing images of facial expressions.
Percept Psychophys. 1999 Feb;61(2):259-74. doi: 10.3758/bf03206887.
7
Recognizing dynamic facial expressions of emotion: Specificity and intensity effects in event-related brain potentials.
Biol Psychol. 2014 Feb;96:111-25. doi: 10.1016/j.biopsycho.2013.12.003. Epub 2013 Dec 19.
8
Older adults' recognition of bodily and auditory expressions of emotion.
Psychol Aging. 2009 Sep;24(3):614-22. doi: 10.1037/a0016356.
9
The contribution of different cues of facial movement to the emotional facial expression adaptation aftereffect.
J Vis. 2013 Jan 18;13(1):23. doi: 10.1167/13.1.23.
10
Effects of age and emotional intensity on the recognition of facial emotion.
Exp Aging Res. 2008 Jan-Mar;34(1):63-79. doi: 10.1080/03610730701762047.

Cited By

1
The Jena Audiovisual Stimuli of Morphed Emotional Pseudospeech (JAVMEPS): A database for emotional auditory-only, visual-only, and congruent and incongruent audiovisual voice and dynamic face stimuli with varying voice intensities.
Behav Res Methods. 2024 Aug;56(5):5103-5115. doi: 10.3758/s13428-023-02249-4. Epub 2023 Oct 11.
2
Development and validation of a natural dynamic facial expression stimulus set.
PLoS One. 2023 Jun 28;18(6):e0287049. doi: 10.1371/journal.pone.0287049. eCollection 2023.
3
Ties between reading faces, bodies, eyes, and autistic traits.
Front Neurosci. 2022 Sep 28;16:997263. doi: 10.3389/fnins.2022.997263. eCollection 2022.
4
Caricatured facial movements enhance perception of emotional facial expressions.
Perception. 2022 May;51(5):313-343. doi: 10.1177/03010066221086452. Epub 2022 Mar 28.
5
Recognizing emotions in bodies: Vagus nerve stimulation enhances recognition of anger while impairing sadness.
Cogn Affect Behav Neurosci. 2021 Dec;21(6):1246-1261. doi: 10.3758/s13415-021-00928-3. Epub 2021 Jul 15.
6
The role of movement kinematics in facial emotion expression production and recognition.
Emotion. 2021 Aug;21(5):1041-1061. doi: 10.1037/emo0000835. Epub 2021 Mar 4.
7
Recognition of Emotions From Facial Point-Light Displays.
Front Psychol. 2020 Jun 4;11:1062. doi: 10.3389/fpsyg.2020.01062. eCollection 2020.
8
Informing, Coordinating, and Performing: A Perspective on Functions of Sensorimotor Communication.
Front Hum Neurosci. 2020 May 26;14:168. doi: 10.3389/fnhum.2020.00168. eCollection 2020.
9
Selective eye fixations on diagnostic face regions of dynamic emotional expressions: KDEF-dyn database.
Sci Rep. 2018 Nov 19;8(1):17039. doi: 10.1038/s41598-018-35259-w.
10
Human Observers and Automated Assessment of Dynamic Emotional Facial Expressions: KDEF-dyn Database Validation.
Front Psychol. 2018 Oct 26;9:2052. doi: 10.3389/fpsyg.2018.02052. eCollection 2018.