


Electrophysiological evidence for speech-specific audiovisual integration.

Affiliations

Basque Center on Cognition, Brain and Language, Paseo Mikeletegi 69, 2nd floor, 20009 Donostia, Spain; Tilburg University, Department of Cognitive Neuropsychology, P.O. Box 90153, Warandelaan 2, 5000 LE, Tilburg, the Netherlands.

Tilburg University, Department of Cognitive Neuropsychology, P.O. Box 90153, Warandelaan 2, 5000 LE, Tilburg, the Netherlands.

Publication Information

Neuropsychologia. 2014 Jan;53:115-21. doi: 10.1016/j.neuropsychologia.2013.11.011. Epub 2013 Nov 27.

DOI: 10.1016/j.neuropsychologia.2013.11.011
PMID: 24291340
Abstract

Lip-read speech is integrated with heard speech at various neural levels. Here, we investigated the extent to which lip-read induced modulations of the auditory N1 and P2 (measured with EEG) are indicative of speech-specific audiovisual integration, and we explored to what extent the ERPs were modulated by phonetic audiovisual congruency. In order to disentangle speech-specific (phonetic) integration from non-speech integration, we used Sine-Wave Speech (SWS) that was perceived as speech by half of the participants (they were in speech-mode), while the other half was in non-speech mode. Results showed that the N1 obtained with audiovisual stimuli peaked earlier than the N1 evoked by auditory-only stimuli. This lip-read induced speeding up of the N1 occurred for listeners in speech and non-speech mode. In contrast, if listeners were in speech-mode, lip-read speech also modulated the auditory P2, but not if listeners were in non-speech mode, thus revealing speech-specific audiovisual binding. Comparing ERPs for phonetically congruent audiovisual stimuli with ERPs for incongruent stimuli revealed an effect of phonetic stimulus congruency that started at ~200 ms after (in)congruence became apparent. Critically, akin to the P2 suppression, congruency effects were only observed if listeners were in speech mode, and not if they were in non-speech mode. Using identical stimuli, we thus confirm that audiovisual binding involves (partially) different neural mechanisms for sound processing in speech and non-speech mode.


Similar Articles

1. Electrophysiological evidence for speech-specific audiovisual integration.
Neuropsychologia. 2014 Jan;53:115-21. doi: 10.1016/j.neuropsychologia.2013.11.011. Epub 2013 Nov 27.
2. Electrophysiological evidence for a multisensory speech-specific mode of perception.
Neuropsychologia. 2012 Jun;50(7):1425-31. doi: 10.1016/j.neuropsychologia.2012.02.027. Epub 2012 Mar 4.
3. Electrophysiological evidence for differences between fusion and combination illusions in audiovisual speech perception.
Eur J Neurosci. 2017 Nov;46(10):2578-2583. doi: 10.1111/ejn.13734. Epub 2017 Nov 6.
4. The level of audiovisual print-speech integration deficits in dyslexia.
Neuropsychologia. 2014 Sep;62:245-61. doi: 10.1016/j.neuropsychologia.2014.07.024. Epub 2014 Jul 30.
5. Changes in visually and auditory attended audiovisual speech processing in cochlear implant users: A longitudinal ERP study.
Hear Res. 2024 Jun;447:109023. doi: 10.1016/j.heares.2024.109023. Epub 2024 Apr 27.
6. Neural Mechanisms Underlying Cross-Modal Phonetic Encoding.
J Neurosci. 2018 Feb 14;38(7):1835-1849. doi: 10.1523/JNEUROSCI.1566-17.2017. Epub 2017 Dec 20.
7. Speech-specific audiovisual integration modulates induced theta-band oscillations.
PLoS One. 2019 Jul 16;14(7):e0219744. doi: 10.1371/journal.pone.0219744. eCollection 2019.
8. Degradation of labial information modifies audiovisual speech perception in cochlear-implanted children.
Ear Hear. 2013 Jan-Feb;34(1):110-21. doi: 10.1097/AUD.0b013e3182670993.
9. Neural correlates of multisensory integration of ecologically valid audiovisual events.
J Cogn Neurosci. 2007 Dec;19(12):1964-73. doi: 10.1162/jocn.2007.19.12.1964.
10. Perception of intersensory synchrony in audiovisual speech: not that special.
Cognition. 2011 Jan;118(1):75-83. doi: 10.1016/j.cognition.2010.10.002. Epub 2010 Oct 29.

Cited By

1. Neural correlates of audiovisual integration in schizophrenia - an ERP study.
Front Psychiatry. 2024 Dec 10;15:1492266. doi: 10.3389/fpsyt.2024.1492266. eCollection 2024.
2. Processing of Visual Speech Cues in Speech-in-Noise Comprehension Depends on Working Memory Capacity and Enhances Neural Speech Tracking in Older Adults With Hearing Impairment.
Trends Hear. 2024 Jan-Dec;28:23312165241287622. doi: 10.1177/23312165241287622.
3. Non-spatial inhibition of return attenuates audiovisual integration owing to modality disparities.
Atten Percept Psychophys. 2024 Oct;86(7):2315-2328. doi: 10.3758/s13414-023-02825-y. Epub 2023 Dec 20.
4. Hearing, seeing, and feeling speech: the neurophysiological correlates of trimodal speech perception.
Front Hum Neurosci. 2023 Aug 29;17:1225976. doi: 10.3389/fnhum.2023.1225976. eCollection 2023.
5. The Processing of Audiovisual Speech Is Linked with Vocabulary in Autistic and Nonautistic Children: An ERP Study.
Brain Sci. 2023 Jul 8;13(7):1043. doi: 10.3390/brainsci13071043.
6. The Effect of Cued-Speech (CS) Perception on Auditory Processing in Typically Hearing (TH) Individuals Who Are Either Naïve or Experienced CS Producers.
Brain Sci. 2023 Jul 7;13(7):1036. doi: 10.3390/brainsci13071036.
7. Audiovisual n-Back Training Alters the Neural Processes of Working Memory and Audiovisual Integration: Evidence of Changes in ERPs.
Brain Sci. 2023 Jun 24;13(7):992. doi: 10.3390/brainsci13070992.
8. Deficient Audiovisual Speech Perception in Schizophrenia: An ERP Study.
Brain Sci. 2023 Jun 19;13(6):970. doi: 10.3390/brainsci13060970.
9. Hierarchically nested networks optimize the analysis of audiovisual speech.
iScience. 2023 Feb 20;26(3):106257. doi: 10.1016/j.isci.2023.106257. eCollection 2023 Mar 17.
10. Incongruent visual cues affect the perception of Mandarin vowel but not tone.
Front Psychol. 2023 Jan 4;13:971979. doi: 10.3389/fpsyg.2022.971979. eCollection 2022.