Fixating the eyes of a speaker provides sufficient visual information to modulate early auditory processing.

Affiliations

Department of Psychological and Brain Sciences, University of Massachusetts, 135 Hicks Way, Amherst, MA, 01003, USA.

Publication Information

Biol Psychol. 2019 Sep;146:107724. doi: 10.1016/j.biopsycho.2019.107724. Epub 2019 Jul 16.

Abstract

In face-to-face conversations, when listeners process and combine information obtained from hearing and seeing a speaker, they mostly look at the eyes rather than at the more informative mouth region. Measuring event-related potentials, we tested whether fixating the speaker's eyes is sufficient for gathering enough visual speech information to modulate early auditory processing, or whether covert attention to the speaker's mouth is needed. Results showed that when listeners fixated the speaker's eye region, the amplitudes of the auditory evoked N1 and P2 were smaller when listeners heard and saw the speaker than when they only heard her. These cross-modal interactions also occurred when, in addition, attention was restricted to the speaker's eye region. Fixating the speaker's eyes thus provides listeners with sufficient visual information to facilitate early auditory processing. The spread of covert attention to the mouth area is not needed to observe audiovisual interactions.
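
The study's key dependent measure is the mean amplitude of the auditory N1 and P2 components, compared between audiovisual and audio-only conditions. As an illustration of how such a comparison is typically computed, here is a minimal Python sketch using synthetic epoch arrays; the sampling rate, baseline length, component windows, and single-channel layout are illustrative assumptions, not the paper's actual analysis parameters.

# Minimal sketch: quantifying N1/P2 mean amplitudes from epoched EEG.
# All arrays, window bounds, and the sampling rate are illustrative
# assumptions -- not the parameters used in the study.
import numpy as np

FS = 500                     # sampling rate in Hz (assumed)
T0 = 0.2                     # seconds of pre-stimulus baseline (assumed)

def mean_amplitude(epochs, t_start, t_end):
    """Mean voltage per trial in a post-stimulus window (seconds)."""
    i0 = int((T0 + t_start) * FS)
    i1 = int((T0 + t_end) * FS)
    return epochs[:, i0:i1].mean(axis=1)

# epochs_av / epochs_a: trials x samples at one fronto-central channel,
# baseline-corrected; filled with random data here as a placeholder.
rng = np.random.default_rng(0)
epochs_av = rng.normal(size=(100, int((T0 + 0.8) * FS)))
epochs_a = rng.normal(size=(100, int((T0 + 0.8) * FS)))

# Typical (assumed) windows: N1 ~ 80-120 ms, P2 ~ 160-220 ms.
for name, (t0, t1) in {"N1": (0.08, 0.12), "P2": (0.16, 0.22)}.items():
    av = mean_amplitude(epochs_av, t0, t1).mean()
    a = mean_amplitude(epochs_a, t0, t1).mean()
    print(f"{name}: audiovisual = {av:.2f} uV, audio-only = {a:.2f} uV")

With real data, the per-trial window means would feed a within-subject statistical comparison of the audiovisual and audio-only conditions.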

Similar Articles

1. Fixating the eyes of a speaker provides sufficient visual information to modulate early auditory processing. Biol Psychol. 2019 Sep;146:107724. doi: 10.1016/j.biopsycho.2019.107724. Epub 2019 Jul 16.
2. Does dynamic information about the speaker's face contribute to semantic speech processing? ERP evidence. Cortex. 2018 Jul;104:12-25. doi: 10.1016/j.cortex.2018.03.031. Epub 2018 Apr 9.
4. Congruent Visual Speech Enhances Cortical Entrainment to Continuous Auditory Speech in Noise-Free Conditions. J Neurosci. 2015 Oct 21;35(42):14195-204. doi: 10.1523/JNEUROSCI.1829-15.2015.
5. On how the brain decodes vocal cues about speaker confidence. Cortex. 2015 May;66:9-34. doi: 10.1016/j.cortex.2015.02.002. Epub 2015 Feb 21.
6. Competing influence of visual speech on auditory neural adaptation. Brain Lang. 2023 Dec;247:105359. doi: 10.1016/j.bandl.2023.105359. Epub 2023 Nov 9.
7. When eyes beat lips: speaker gaze affects audiovisual integration in the McGurk illusion. Psychol Res. 2022 Sep;86(6):1930-1943. doi: 10.1007/s00426-021-01618-y. Epub 2021 Dec 2.
8. Does the speaker's eye gaze facilitate infants' word segmentation from continuous speech? An ERP study. Dev Sci. 2024 Mar;27(2):e13436. doi: 10.1111/desc.13436. Epub 2023 Aug 8.
9. Gaze aversion to stuttered speech: a pilot study investigating differential visual attention to stuttered and fluent speech. Int J Lang Commun Disord. 2010 Mar-Apr;45(2):133-44. doi: 10.3109/13682820902763951.
10. Eye Gaze and Perceptual Adaptation to Audiovisual Degraded Speech. J Speech Lang Hear Res. 2021 Sep 14;64(9):3432-3445. doi: 10.1044/2021_JSLHR-21-00106. Epub 2021 Aug 31.

Cited By

1. Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech. J Neurosci. 2020 Jan 29;40(5):1053-1065. doi: 10.1523/JNEUROSCI.1101-19.2019. Epub 2019 Dec 30.

References

1. Electrophysiological evidence for differences between fusion and combination illusions in audiovisual speech perception. Eur J Neurosci. 2017 Nov;46(10):2578-2583. doi: 10.1111/ejn.13734. Epub 2017 Nov 6.
2. Neural mechanisms of eye contact when listening to another person talking. Soc Cogn Affect Neurosci. 2017 Feb 1;12(2):319-328. doi: 10.1093/scan/nsw127.
3. Spatial Frequency Requirements and Gaze Strategy in Visual-Only and Audiovisual Speech Perception. J Speech Lang Hear Res. 2016 Aug 1;59(4):601-15. doi: 10.1044/2016_JSLHR-S-15-0092.
4. Quantifying lip-read-induced suppression and facilitation of the auditory N1 and P2 reveals peak enhancements and delays. Psychophysiology. 2016 Sep;53(9):1295-306. doi: 10.1111/psyp.12683. Epub 2016 Jun 13.
5. Using EEG and stimulus context to probe the modelling of auditory-visual speech. Cortex. 2016 Feb;75:220-230. doi: 10.1016/j.cortex.2015.03.010. Epub 2015 Apr 17.
6. ERPLAB: an open-source toolbox for the analysis of event-related potentials. Front Hum Neurosci. 2014 Apr 14;8:213. doi: 10.3389/fnhum.2014.00213. eCollection 2014.
7. Electrophysiological evidence for speech-specific audiovisual integration. Neuropsychologia. 2014 Jan;53:115-21. doi: 10.1016/j.neuropsychologia.2013.11.011. Epub 2013 Nov 27.
8. The attentional requirements of consciousness. Trends Cogn Sci. 2012 Aug;16(8):411-7. doi: 10.1016/j.tics.2012.06.013. Epub 2012 Jul 12.
10. Natural-scene perception requires attention. Psychol Sci. 2011 Sep;22(9):1165-72. doi: 10.1177/0956797611419168. Epub 2011 Aug 12.
