Reading your own lips: common-coding theory and visual speech perception.

Author information

Department of Otolaryngology, Washington University School of Medicine, Campus Box 8115, 660 South Euclid Avenue, St. Louis, MO 63124, USA.

Publication information

Psychon Bull Rev. 2013 Feb;20(1):115-9. doi: 10.3758/s13423-012-0328-5.

Abstract

Common-coding theory posits that (1) perceiving an action activates the same representations of motor plans that are activated by actually performing that action, and (2) because of individual differences in the ways that actions are performed, observing recordings of one's own previous behavior activates motor plans to an even greater degree than does observing someone else's behavior. We hypothesized that if observing oneself activates motor plans to a greater degree than does observing others, and if these activated plans contribute to perception, then people should be able to lipread silent video clips of their own previous utterances more accurately than they can lipread video clips of other talkers. As predicted, two groups of participants were able to lipread video clips of themselves, recorded more than two weeks earlier, significantly more accurately than video clips of others. These results suggest that visual input activates speech motor activity that links to word representations in the mental lexicon.


Similar articles

1
Reading your own lips: common-coding theory and visual speech perception.
Psychon Bull Rev. 2013 Feb;20(1):115-9. doi: 10.3758/s13423-012-0328-5.
2
The self-advantage in visual speech processing enhances audiovisual speech recognition in noise.
Psychon Bull Rev. 2015 Aug;22(4):1048-53. doi: 10.3758/s13423-014-0774-3.
3
Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech.
J Neurosci. 2020 Jan 29;40(5):1053-1065. doi: 10.1523/JNEUROSCI.1101-19.2019. Epub 2019 Dec 30.
4
Modalities of memory: is reading lips like hearing voices?
Cognition. 2013 Dec;129(3):471-93. doi: 10.1016/j.cognition.2013.08.017. Epub 2013 Sep 14.
5
Two cortical mechanisms support the integration of visual and auditory speech: a hypothesis and preliminary data.
Neurosci Lett. 2009 Mar 20;452(3):219-23. doi: 10.1016/j.neulet.2009.01.060. Epub 2009 Jan 29.
6
Electrophysiological evidence for Audio-visuo-lingual speech integration.
Neuropsychologia. 2018 Jan 31;109:126-133. doi: 10.1016/j.neuropsychologia.2017.12.024. Epub 2017 Dec 14.
7
Listening to talking faces: motor cortical activation during speech perception.
Neuroimage. 2005 Mar;25(1):76-89. doi: 10.1016/j.neuroimage.2004.11.006. Epub 2005 Jan 8.
8
Lipreading and covert speech production similarly modulate human auditory-cortex responses to pure tones.
J Neurosci. 2010 Jan 27;30(4):1314-21. doi: 10.1523/JNEUROSCI.1950-09.2010.
9
Spoken word recognition by eye.
Scand J Psychol. 2009 Oct;50(5):419-25. doi: 10.1111/j.1467-9450.2009.00751.x.

Cited by

1
Mirrors and toothaches: commonplace manipulations of non-auditory feedback availability change perceived speech intelligibility.
Front Hum Neurosci. 2024 Nov 27;18:1462922. doi: 10.3389/fnhum.2024.1462922. eCollection 2024.
2
Effector-specific motor simulation supplements core action recognition processes in adverse conditions.
Soc Cogn Affect Neurosci. 2023 Oct 13;18(1). doi: 10.1093/scan/nsad046.
3
The own-voice benefit for word recognition in early bilinguals.
Front Psychol. 2022 Sep 2;13:901326. doi: 10.3389/fpsyg.2022.901326. eCollection 2022.
4
Increased Connectivity among Sensory and Motor Regions during Visual and Audiovisual Speech Perception.
J Neurosci. 2022 Jan 19;42(3):435-442. doi: 10.1523/JNEUROSCI.0114-21.2021. Epub 2021 Nov 23.
5
No "Self" Advantage for Audiovisual Speech Aftereffects.
Front Psychol. 2019 Mar 22;10:658. doi: 10.3389/fpsyg.2019.00658. eCollection 2019.
6
Electrophysiological evidence for a self-processing advantage during audiovisual speech integration.
Exp Brain Res. 2017 Sep;235(9):2867-2876. doi: 10.1007/s00221-017-5018-0. Epub 2017 Jul 4.
7
Do We Perceive Others Better than Ourselves? A Perceptual Benefit for Noise-Vocoded Speech Produced by an Average Speaker.
PLoS One. 2015 Jul 2;10(7):e0129731. doi: 10.1371/journal.pone.0129731. eCollection 2015.
8
Prediction and constraint in audiovisual speech perception.
Cortex. 2015 Jul;68:169-81. doi: 10.1016/j.cortex.2015.03.006. Epub 2015 Mar 20.
9
The self-advantage in visual speech processing enhances audiovisual speech recognition in noise.
Psychon Bull Rev. 2015 Aug;22(4):1048-53. doi: 10.3758/s13423-014-0774-3.

References

1
Sensorimotor integration in speech processing: computational basis and neural organization.
Neuron. 2011 Feb 10;69(3):407-22. doi: 10.1016/j.neuron.2011.01.019.
2
The role of mirror neurons in speech and language processing.
Brain Lang. 2010 Jan;112(1):1-2. doi: 10.1016/j.bandl.2009.10.006. Epub 2009 Nov 30.
3
Lexicality drives audio-motor transformations in Broca's area.
Brain Lang. 2010 Jan;112(1):3-11. doi: 10.1016/j.bandl.2009.07.008. Epub 2009 Aug 20.
4
Obligatory Broca's area modulation associated with passive speech perception.
Neuroreport. 2009 Mar 25;20(5):492-6. doi: 10.1097/WNR.0b013e32832940a0.
5
Auditory-visual discourse comprehension by older and young adults in favorable and unfavorable conditions.
Int J Audiol. 2008 Nov;47 Suppl 2(Suppl 2):S31-7. doi: 10.1080/14992020802301662.
6
Motor speech perception modulates the cortical language areas.
Neuroimage. 2008 Jun;41(2):605-13. doi: 10.1016/j.neuroimage.2008.02.046. Epub 2008 Mar 6.
7
Auditory and visual lexical neighborhoods in audiovisual speech perception.
Trends Amplif. 2007 Dec;11(4):233-41. doi: 10.1177/1084713807307409.
8
The English Lexicon Project.
Behav Res Methods. 2007 Aug;39(3):445-59. doi: 10.3758/bf03193014.
9
The motor theory of speech perception reviewed.
Psychon Bull Rev. 2006 Jun;13(3):361-77. doi: 10.3758/bf03193857.
10
Motor cortex maps articulatory features of speech sounds.
Proc Natl Acad Sci U S A. 2006 May 16;103(20):7865-70. doi: 10.1073/pnas.0509989103. Epub 2006 May 8.
