


Eye Gaze and Perceptual Adaptation to Audiovisual Degraded Speech.

Affiliations

Division of Neuroscience and Experimental Psychology, Faculty of Biology, Medicine and Health, The University of Manchester, United Kingdom.

Manchester Centre for Audiology and Deafness, Faculty of Biology, Medicine and Health, The University of Manchester, United Kingdom.

Publication Information

J Speech Lang Hear Res. 2021 Sep 14;64(9):3432-3445. doi: 10.1044/2021_JSLHR-21-00106. Epub 2021 Aug 31.

DOI: 10.1044/2021_JSLHR-21-00106
PMID: 34463528
Abstract

Purpose

Visual cues from a speaker's face may benefit perceptual adaptation to degraded speech, but current evidence is limited. We aimed to replicate results from previous studies to establish the extent to which visual speech cues can lead to greater adaptation over time, extending existing results to a real-time adaptation paradigm (i.e., without a separate training period). A second aim was to investigate whether eye gaze patterns toward the speaker's mouth were related to better perception, hypothesizing that listeners who looked more at the speaker's mouth would show greater adaptation.

Method

A group of listeners (n = 30) was presented with 90 noise-vocoded sentences in audiovisual format, whereas a control group (n = 29) was presented with the audio signal only. Recognition accuracy was measured throughout, and eye tracking was used to measure fixations toward the speaker's eyes and mouth in the audiovisual group.

Results

Previous studies were partially replicated: The audiovisual group had better recognition throughout and adapted slightly more rapidly, but both groups showed an equal amount of improvement overall. Longer fixations on the speaker's mouth in the audiovisual group were related to better overall accuracy. An exploratory analysis further demonstrated that the duration of fixations to the speaker's mouth decreased over time.

Conclusions

The results suggest that visual cues may not benefit adaptation to degraded speech as much as previously thought. Longer fixations on a speaker's mouth may play a role in successfully decoding visual speech cues; however, this will need to be confirmed in future research to fully understand how patterns of eye gaze are related to audiovisual speech recognition. All materials, data, and code are available at https://osf.io/2wqkf/.


Similar Articles

1
Audiovisual cues benefit recognition of accented speech in noise but not perceptual adaptation.
Front Hum Neurosci. 2015 Aug 3;9:422. doi: 10.3389/fnhum.2015.00422. eCollection 2015.
2
The Relevance of the Availability of Visual Speech Cues During Adaptation to Noise-Vocoded Speech.
J Speech Lang Hear Res. 2021 Jul 16;64(7):2513-2528. doi: 10.1044/2021_JSLHR-20-00575. Epub 2021 Jun 23.
3
Increasing audiovisual speech integration in autism through enhanced attention to mouth.
Dev Sci. 2023 Jul;26(4):e13348. doi: 10.1111/desc.13348. Epub 2022 Dec 1.
4
Visual fixations during processing of time-compressed audiovisual presentations.
Atten Percept Psychophys. 2024 Feb;86(2):367-372. doi: 10.3758/s13414-023-02838-7. Epub 2024 Jan 4.
5
Does the speaker's eye gaze facilitate infants' word segmentation from continuous speech? An ERP study.
Dev Sci. 2024 Mar;27(2):e13436. doi: 10.1111/desc.13436. Epub 2023 Aug 8.
6
Culture and listeners' gaze responses to stuttering.
Int J Lang Commun Disord. 2012 Jul-Aug;47(4):388-97. doi: 10.1111/j.1460-6984.2012.00152.x. Epub 2012 May 28.
7
Psychobiological Responses Reveal Audiovisual Noise Differentially Challenges Speech Recognition.
Ear Hear. 2020 Mar/Apr;41(2):268-277. doi: 10.1097/AUD.0000000000000755.
8
Face-viewing patterns predict audiovisual speech integration in autistic children.
Autism Res. 2021 Dec;14(12):2592-2602. doi: 10.1002/aur.2598. Epub 2021 Aug 20.
9
When eyes beat lips: speaker gaze affects audiovisual integration in the McGurk illusion.
Psychol Res. 2022 Sep;86(6):1930-1943. doi: 10.1007/s00426-021-01618-y. Epub 2021 Dec 2.

Cited By

1
Perceptual Learning of Noise-Vocoded Speech Under Divided Attention.
Trends Hear. 2023 Jan-Dec;27:23312165231192297. doi: 10.1177/23312165231192297.
2
Event-Related Potentials in Assessing Visual Speech Cues in the Broader Autism Phenotype: Evidence from a Phonemic Restoration Paradigm.
Brain Sci. 2023 Jun 30;13(7):1011. doi: 10.3390/brainsci13071011.
3
Neural and Behavioral Differences in Speech Perception for Children With Autism Spectrum Disorders Within an Audiovisual Context.
J Speech Lang Hear Res. 2023 Jul 12;66(7):2390-2403. doi: 10.1044/2023_JSLHR-22-00661. Epub 2023 Jun 30.
4
Where on the face do we look during phonemic restoration: An eye-tracking study.
Front Psychol. 2023 May 25;14:1005186. doi: 10.3389/fpsyg.2023.1005186. eCollection 2023.
5
How do face masks impact communication amongst deaf/HoH people?
Cogn Res Princ Implic. 2022 Sep 5;7(1):81. doi: 10.1186/s41235-022-00431-4.