
The not face: A grammaticalization of facial expressions of emotion.

Affiliations

The Ohio State University, Columbus, OH, United States.

Purdue University, West Lafayette, IN, United States.

Publication information

Cognition. 2016 May;150:77-84. doi: 10.1016/j.cognition.2016.02.004. Epub 2016 Feb 9.

Abstract

Facial expressions of emotion are thought to have evolved from the development of facial muscles used in sensory regulation and later adapted to express moral judgment. Negative moral judgment includes the expressions of anger, disgust and contempt. Here, we study the hypothesis that these facial expressions of negative moral judgment have further evolved into a facial expression of negation regularly used as a grammatical marker in human language. Specifically, we show that people from different cultures expressing negation use the same facial muscles as those employed to express negative moral judgment. We then show that this nonverbal signal is used as a co-articulator in speech and that, in American Sign Language, it has been grammaticalized as a non-manual marker. Furthermore, this facial expression of negation exhibits the theta oscillation (3-8 Hz) universally seen in syllable and mouthing production in speech and signing. These results provide evidence for the hypothesis that some components of human language have evolved from facial expressions of emotion, and suggest an evolutionary route for the emergence of grammatical markers.

Similar articles

1
Effects of affective and emotional congruency on facial expression processing under different task demands.
Acta Psychol (Amst). 2018 Jun;187:66-76. doi: 10.1016/j.actpsy.2018.04.013. Epub 2018 May 8.
2
Sex differences in neural activation to facial expressions denoting contempt and disgust.
PLoS One. 2008;3(11):e3622. doi: 10.1371/journal.pone.0003622. Epub 2008 Nov 5.
3
Judgments of subtle facial expressions of emotion.
Emotion. 2014 Apr;14(2):349-57. doi: 10.1037/a0035237.
4
Understanding 'not': neuropsychological dissociations between hand and head markers of negation in BSL.
Neuropsychologia. 2004;42(2):214-29. doi: 10.1016/s0028-3932(03)00186-6.
5
Categorical perception of emotional facial expressions does not require lexical categories.
Emotion. 2011 Dec;11(6):1479-83. doi: 10.1037/a0025336. Epub 2011 Oct 17.
6
Many moral buttons or just one? Evidence from emotional facial expressions.
Cogn Emot. 2019 Aug;33(5):943-958. doi: 10.1080/02699931.2018.1520078. Epub 2018 Sep 11.
7
Cross-emotion facial expression aftereffects.
Vision Res. 2011 Sep 1;51(17):1889-96. doi: 10.1016/j.visres.2011.06.017. Epub 2011 Jul 8.
8
The word disgust may refer to more than one emotion.
Emotion. 2016 Apr;16(3):301-308. doi: 10.1037/emo0000118. Epub 2015 Nov 23.

Cited by

1
From Seed to System: The Emergence of Non-Manual Markers for Wh-Questions in Nicaraguan Sign Language.
Languages (Basel). 2022 Jun;7(2). doi: 10.3390/languages7020137. Epub 2022 May 30.
2
Evidence for compositional abilities in one-year-old infants.
Commun Psychol. 2025 Mar 10;3(1):37. doi: 10.1038/s44271-025-00222-9.
3
An android can show the facial expressions of complex emotions.
Sci Rep. 2025 Jan 19;15(1):2433. doi: 10.1038/s41598-024-84224-3.
4
Facial signals shape predictions about the nature of upcoming conversational responses.
Sci Rep. 2025 Jan 9;15(1):1381. doi: 10.1038/s41598-025-85192-y.
5
Cross-Linguistic Recognition of Irony Through Visual and Acoustic Cues.
J Psycholinguist Res. 2024 Nov 15;53(6):73. doi: 10.1007/s10936-024-10111-7.
6
Semantic and syntactic processing of emojis in sentential intermediate positions.
Cogn Neurodyn. 2024 Aug;18(4):1743-1752. doi: 10.1007/s11571-023-10037-1. Epub 2023 Dec 13.
7
Specific facial signals associate with categories of social actions conveyed through questions.
PLoS One. 2023 Jul 19;18(7):e0288104. doi: 10.1371/journal.pone.0288104. eCollection 2023.
8
Clients' Facial Expressions of Self-Compassion, Self-Criticism, and Self-Protection in Emotion-Focused Therapy Videos.
Int J Environ Res Public Health. 2023 Jan 9;20(2):1129. doi: 10.3390/ijerph20021129.
9
Interactionally Embedded Gestalt Principles of Multimodal Human Communication.
Perspect Psychol Sci. 2023 Sep;18(5):1136-1159. doi: 10.1177/17456916221141422. Epub 2023 Jan 12.
10
Visual bodily signals as core devices for coordinating minds in interaction.
Philos Trans R Soc Lond B Biol Sci. 2022 Sep 12;377(1859):20210094. doi: 10.1098/rstb.2021.0094. Epub 2022 Jul 25.

References

1
A Neural Basis of Facial Action Recognition in Humans.
J Neurosci. 2016 Apr 20;36(16):4434-42. doi: 10.1523/JNEUROSCI.1704-15.2016.
2
Simultaneous perception of a spoken and a signed language: The brain basis of ASL-English code-blends.
Brain Lang. 2015 Aug;147:96-106. doi: 10.1016/j.bandl.2015.05.006. Epub 2015 Jul 10.
3
Heterogeneity of long-history migration explains cultural differences in reports of emotional expressivity and the functions of smiles.
Proc Natl Acad Sci U S A. 2015 May 12;112(19):E2429-36. doi: 10.1073/pnas.1413661112. Epub 2015 Apr 20.
4
Spontaneous facial expression in unscripted social interactions can be measured automatically.
Behav Res Methods. 2015 Dec;47(4):1136-1147. doi: 10.3758/s13428-014-0536-1.
5
How could language have evolved?
PLoS Biol. 2014 Aug 26;12(8):e1001934. doi: 10.1371/journal.pbio.1001934. eCollection 2014 Aug.
6
Compound facial expressions of emotion.
Proc Natl Acad Sci U S A. 2014 Apr 15;111(15):E1454-62. doi: 10.1073/pnas.1322355111. Epub 2014 Mar 31.
7
Automatic decoding of facial movements reveals deceptive pain expressions.
Curr Biol. 2014 Mar 31;24(7):738-43. doi: 10.1016/j.cub.2014.02.009. Epub 2014 Mar 20.
8
Discriminant features and temporal structure of nonmanuals in American Sign Language.
PLoS One. 2014 Feb 6;9(2):e86268. doi: 10.1371/journal.pone.0086268. eCollection 2014.
9
Speech rhythms and multiplexed oscillatory sensory coding in the human brain.
PLoS Biol. 2013 Dec;11(12):e1001752. doi: 10.1371/journal.pbio.1001752. Epub 2013 Dec 31.
