

Sequential audiovisual interactions during speech perception: a whole-head MEG study.

Author Information

Hertrich Ingo, Mathiak Klaus, Lutzenberger Werner, Menning Hans, Ackermann Hermann

Affiliation

Department of General Neurology, Hertie Institute for Clinical Brain Research, University of Tübingen, Germany.

Publication Information

Neuropsychologia. 2007 Mar 25;45(6):1342-54. doi: 10.1016/j.neuropsychologia.2006.09.019. Epub 2006 Oct 25.

DOI: 10.1016/j.neuropsychologia.2006.09.019
PMID: 17067640
Abstract

Using whole-head magnetoencephalography (MEG), audiovisual (AV) interactions during speech perception (/ta/- and /pa/-syllables) were investigated in 20 subjects. Congruent AV events served as the 'standards' of an oddball design. The deviants encompassed incongruent /ta/-/pa/ configurations differing from the standards either in the acoustic or the visual domain. As an auditory non-speech control condition, the same video signals were synchronized with either one of two complex tones. As in natural speech, visual movement onset preceded acoustic signals by about 150 ms. First, the impact of visual information on auditorily evoked fields to non-speech sounds was determined. Larger facial movements (/pa/ versus /ta/) yielded enhanced early responses such as the M100 component, indicating, most presumably, anticipatory pre-activation of auditory cortex by visual motion cues. As a second step of analysis, mismatch fields (MMF) were calculated. Acoustic deviants elicited a typical MMF, peaking ca. 180 ms after stimulus onset, whereas visual deviants gave rise to later responses (220 ms) of a more posterior-medial source location. Finally, a late (275 ms), left-lateralized visually-induced MMF component, resembling the acoustic mismatch response, emerged during the speech condition, presumably reflecting phonetic/linguistic operations. There is mounting functional imaging evidence for an early impact of visual information on auditory cortical regions during speech perception. The present study suggests at least two successive AV interactions in association with syllable recognition tasks: early activation of auditory areas depending upon visual motion cues and a later speech-specific left-lateralized response mediated, conceivably, by backward-projections from multisensory areas.


Similar Articles

1
Sequential audiovisual interactions during speech perception: a whole-head MEG study.
Neuropsychologia. 2007 Mar 25;45(6):1342-54. doi: 10.1016/j.neuropsychologia.2006.09.019. Epub 2006 Oct 25.
2
Time course of early audiovisual interactions during speech and nonspeech central auditory processing: a magnetoencephalography study.
J Cogn Neurosci. 2009 Feb;21(2):259-74. doi: 10.1162/jocn.2008.21019.
3
Cross-modal interactions during perception of audiovisual speech and nonspeech signals: an fMRI study.
J Cogn Neurosci. 2011 Jan;23(1):221-37. doi: 10.1162/jocn.2010.21421.
4
Time course of multisensory interactions during audiovisual speech perception in humans: a magnetoencephalographic study.
Neurosci Lett. 2004 Jun 10;363(2):112-5. doi: 10.1016/j.neulet.2004.03.076.
5
Hearing lips: gamma-band activity during audiovisual speech perception.
Cereb Cortex. 2005 May;15(5):646-53. doi: 10.1093/cercor/bhh166. Epub 2004 Sep 1.
6
Selective influences of cross-modal spatial-cues on preattentive auditory processing: a whole-head magnetoencephalography study.
Neuroimage. 2005 Nov 15;28(3):627-34. doi: 10.1016/j.neuroimage.2005.06.030. Epub 2005 Jul 28.
7
Multisensory integration sites identified by perception of spatial wavelet filtered visual speech gesture information.
J Cogn Neurosci. 2004 Jun;16(5):805-16. doi: 10.1162/089892904970771.
8
Disentangling the effects of phonation and articulation: hemispheric asymmetries in the auditory N1m response of the human brain.
BMC Neurosci. 2005 Oct 15;6:62. doi: 10.1186/1471-2202-6-62.
9
The effect of viewing speech on auditory speech processing is different in the left and right hemispheres.
Brain Res. 2008 Nov 25;1242:151-61. doi: 10.1016/j.brainres.2008.04.077. Epub 2008 May 11.
10
Gamma-band activity over early sensory areas predicts detection of changes in audiovisual speech stimuli.
Neuroimage. 2006 May 1;30(4):1376-82. doi: 10.1016/j.neuroimage.2005.10.042. Epub 2005 Dec 20.

Cited By

1
Interpretation of Social Interactions: Functional Imaging of Cognitive-Semiotic Categories During Naturalistic Viewing.
Front Hum Neurosci. 2018 Aug 14;12:296. doi: 10.3389/fnhum.2018.00296. eCollection 2018.
2
Perceived Conventionality in Co-speech Gestures Involves the Fronto-Temporal Language Network.
Front Hum Neurosci. 2017 Nov 30;11:573. doi: 10.3389/fnhum.2017.00573. eCollection 2017.
3
Audiovisual integration for speech during mid-childhood: electrophysiological evidence.
Brain Lang. 2014 Dec;139:36-48. doi: 10.1016/j.bandl.2014.09.011. Epub 2014 Oct 24.
4
The sound of your lips: electrophysiological cross-modal interactions during hand-to-face and face-to-face speech perception.
Front Psychol. 2014 May 13;5:420. doi: 10.3389/fpsyg.2014.00420. eCollection 2014.
5
How can audiovisual pathways enhance the temporal resolution of time-compressed speech in blind subjects?
Front Psychol. 2013 Aug 16;4:530. doi: 10.3389/fpsyg.2013.00530. eCollection 2013.
6
Lipreading and covert speech production similarly modulate human auditory-cortex responses to pure tones.
J Neurosci. 2010 Jan 27;30(4):1314-21. doi: 10.1523/JNEUROSCI.1950-09.2010.
7
Dual neural routing of visual facilitation in speech processing.
J Neurosci. 2009 Oct 28;29(43):13445-53. doi: 10.1523/JNEUROSCI.3194-09.2009.
8
Audiovisual integration during speech comprehension: an fMRI study comparing ROI-based and whole brain analyses.
Hum Brain Mapp. 2009 Jul;30(7):1990-9. doi: 10.1002/hbm.20640.
9
Look who's talking: the deployment of visuo-spatial attention during multisensory speech processing under noisy environmental conditions.
Neuroimage. 2008 Nov 1;43(2):379-87. doi: 10.1016/j.neuroimage.2008.06.046. Epub 2008 Jul 18.