


Sight and sound persistently out of synch: stable individual differences in audiovisual synchronisation revealed by implicit measures of lip-voice integration.

Affiliations

City, University of London, London, UK.

University of Sussex, Falmer, UK.

Publication

Sci Rep. 2017 Apr 21;7:46413. doi: 10.1038/srep46413.

DOI: 10.1038/srep46413
PMID: 28429784
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5399466/
Abstract

Are sight and sound out of synch? Signs that they are have been dismissed for over two centuries as an artefact of attentional and response bias, to which traditional subjective methods are prone. To avoid such biases, we measured performance on objective tasks that depend implicitly on achieving good lip-synch. We measured the McGurk effect (in which incongruent lip-voice pairs evoke illusory phonemes), and also identification of degraded speech, while manipulating audiovisual asynchrony. Peak performance was found at an average auditory lag of ~100 ms, but this varied widely between individuals. Participants' individual optimal asynchronies showed trait-like stability when the same task was re-tested one week later, but measures based on different tasks did not correlate. This discounts the possible influence of common biasing factors, suggesting instead that our different tasks probe different brain networks, each subject to their own intrinsic auditory and visual processing latencies. Our findings call for renewed interest in the biological causes and cognitive consequences of individual sensory asynchronies, leading potentially to fresh insights into the neural representation of sensory timing. A concrete implication is that speech comprehension might be enhanced, by first measuring each individual's optimal asynchrony and then applying a compensatory auditory delay.
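The abstract's closing implication — measure each individual's optimal audiovisual asynchrony, then delay the audio track to compensate — can be sketched in a few lines. The following is an illustrative sketch only, not code from the paper; the function name, parameters, and the use of a NumPy sample buffer are all assumptions for the example.

```python
import numpy as np

def apply_compensatory_delay(audio: np.ndarray, sample_rate: int,
                             delay_ms: float) -> np.ndarray:
    """Shift a mono audio buffer in time, keeping its original length.

    A positive delay_ms makes the sound start later (silence is padded at
    the front), which is the direction suggested by the paper's average
    optimal auditory lag of ~100 ms; a negative value advances the audio.
    """
    shift = int(round(sample_rate * delay_ms / 1000.0))
    if shift == 0:
        return audio.copy()
    out = np.zeros_like(audio)
    if shift > 0:
        out[shift:] = audio[:-shift]   # delay: drop the tail, pad the front
    else:
        out[:shift] = audio[-shift:]   # advance: drop the head, pad the tail
    return out

# Example: at 48 kHz, a 100 ms compensatory delay shifts the track by
# 4800 samples; in practice delay_ms would come from the individual's
# measured optimal asynchrony.
```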


Figures 1-4 (from the PMC full text):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2d21/5399466/c0384564a2d9/srep46413-f1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2d21/5399466/840263a58cf9/srep46413-f2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2d21/5399466/1a6109bb3f83/srep46413-f3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2d21/5399466/c90e8c0e6bc6/srep46413-f4.jpg

Similar Articles

1. Sight and sound persistently out of synch: stable individual differences in audiovisual synchronisation revealed by implicit measures of lip-voice integration.
Sci Rep. 2017 Apr 21;7:46413. doi: 10.1038/srep46413.
2. Sight and sound out of synch: fragmentation and renormalisation of audiovisual integration and subjective timing.
Cortex. 2013 Nov-Dec;49(10):2875-87. doi: 10.1016/j.cortex.2013.03.006. Epub 2013 Apr 1.
3. Correlation of individual differences in audiovisual asynchrony across stimuli and tasks: New constraints on temporal renormalization theory.
J Exp Psychol Hum Percept Perform. 2018 Aug;44(8):1283-1293. doi: 10.1037/xhp0000535. Epub 2018 May 7.
4. Automatic audiovisual integration in speech perception.
Exp Brain Res. 2005 Nov;167(1):66-75. doi: 10.1007/s00221-005-0008-z. Epub 2005 Oct 29.
5. The role of audiovisual asynchrony in person recognition.
Q J Exp Psychol (Hove). 2010 Jan;63(1):23-30. doi: 10.1080/17470210903144376. Epub 2009 Aug 10.
6. Degradation of labial information modifies audiovisual speech perception in cochlear-implanted children.
Ear Hear. 2013 Jan-Feb;34(1):110-21. doi: 10.1097/AUD.0b013e3182670993.
7. Attention to touch weakens audiovisual speech integration.
Exp Brain Res. 2007 Nov;183(3):399-404. doi: 10.1007/s00221-007-1110-1.
8. Electrophysiological evidence for speech-specific audiovisual integration.
Neuropsychologia. 2014 Jan;53:115-21. doi: 10.1016/j.neuropsychologia.2013.11.011. Epub 2013 Nov 27.
9. Audiovisual perception of congruent and incongruent Dutch front vowels.
J Speech Lang Hear Res. 2012 Dec;55(6):1788-801. doi: 10.1044/1092-4388(2012/11-0227). Epub 2012 Sep 19.
10. Top-down attention regulates the neural expression of audiovisual integration.
Neuroimage. 2015 Oct 1;119:272-85. doi: 10.1016/j.neuroimage.2015.06.052. Epub 2015 Jun 26.

Cited By

1. Sensory experience during early sensitive periods shapes cross-modal temporal biases.
Elife. 2020 Aug 25;9:e61238. doi: 10.7554/eLife.61238.
2. The perceived present: What is it, and what is it there for?
Psychon Bull Rev. 2020 Aug;27(4):583-601. doi: 10.3758/s13423-020-01726-7.
3. Alpha Activity Reflects the Magnitude of an Individual Bias in Human Perception.
4. The recalibration patterns of perceptual synchrony and multisensory integration after exposure to asynchronous speech.
J Neurosci. 2020 Apr 22;40(17):3443-3454. doi: 10.1523/JNEUROSCI.2359-19.2020. Epub 2020 Mar 16.
5. Judging Relative Onsets and Offsets of Audiovisual Events.
Vision (Basel). 2020 Mar 3;4(1):17. doi: 10.3390/vision4010017.
6. Audiovisual Temporal Processing in Postlingually Deafened Adults with Cochlear Implants.
Sci Rep. 2018 Jul 27;8(1):11345. doi: 10.1038/s41598-018-29598-x.
7. Temporal Audiovisual Motion Prediction in 2D- vs. 3D-Environments.
Front Psychol. 2018 Mar 21;9:368. doi: 10.3389/fpsyg.2018.00368. eCollection 2018.

References

1. A Roving Dual-Presentation Simultaneity-Judgment Task to Estimate the Point of Subjective Simultaneity.
Front Psychol. 2016 Mar 24;7:416. doi: 10.3389/fpsyg.2016.00416. eCollection 2016.
2. Distinct cortical locations for integration of audiovisual speech and the McGurk effect.
Front Psychol. 2014 Jun 2;5:534. doi: 10.3389/fpsyg.2014.00534. eCollection 2014.
3. Recalibration patterns of perceptual synchrony and multisensory integration after exposure to asynchronous speech.
Neurosci Lett. 2014 May 21;569:148-52. doi: 10.1016/j.neulet.2014.03.057. Epub 2014 Apr 3.
4. Encoding of event timing in the phase of neural oscillations.
Neuroimage. 2014 May 15;92:274-84. doi: 10.1016/j.neuroimage.2014.02.010. Epub 2014 Feb 13.
5. Multisensory temporal integration in autism spectrum disorders.
J Neurosci. 2014 Jan 15;34(3):691-7. doi: 10.1523/JNEUROSCI.3615-13.2014.
6. The roles of physical and physiological simultaneity in audiovisual multisensory facilitation.
Iperception. 2013 Jun 3;4(4):213-28. doi: 10.1068/i0532. eCollection 2013.
7. Sight and sound out of synch: fragmentation and renormalisation of audiovisual integration and subjective timing.
Cortex. 2013 Nov-Dec;49(10):2875-87. doi: 10.1016/j.cortex.2013.03.006. Epub 2013 Apr 1.
8. A psychophysical investigation of differences between synchrony and temporal order judgments.
PLoS One. 2013;8(1):e54798. doi: 10.1371/journal.pone.0054798. Epub 2013 Jan 21.
9. Time and the brain: neurorelativity: The chronoarchitecture of the brain from the neuronal rather than the observer's perspective.
Trends Cogn Sci. 2013 Feb;17(2):51-2. doi: 10.1016/j.tics.2012.12.005. Epub 2013 Jan 12.
10. Binding and unbinding the auditory and visual streams in the McGurk effect.
J Acoust Soc Am. 2012 Aug;132(2):1061-77. doi: 10.1121/1.4728187.