Auditory-visual perception of speech.

Author Information

Erber N P

Publication Information

J Speech Hear Disord. 1975 Nov;40(4):481-92. doi: 10.1044/jshd.4004.481.

DOI: 10.1044/jshd.4004.481
PMID: 1234963
Abstract

Hearing-impaired persons usually perceive speech by watching the face of the talker while listening through a hearing aid. Normal-hearing persons also tend to rely on visual cues, especially when they communicate in noisy or reverberant environments. Numerous clinical and laboratory studies on the auditory-visual performance of normal-hearing and hearing-impaired children and adults demonstrate that combined auditory-visual perception is superior to perception through either audition or vision alone. This paper reviews these studies and provides a rationale for routine evaluation of auditory-visual speech perception in audiology clinics.
Similar Articles

1. Auditory-visual perception of speech.
   J Speech Hear Disord. 1975 Nov;40(4):481-92. doi: 10.1044/jshd.4004.481.
2. Audiovisual integration and lipreading abilities of older adults with normal and impaired hearing.
   Ear Hear. 2007 Sep;28(5):656-68. doi: 10.1097/AUD.0b013e31812f7185.
3. Degradation of labial information modifies audiovisual speech perception in cochlear-implanted children.
   Ear Hear. 2013 Jan-Feb;34(1):110-21. doi: 10.1097/AUD.0b013e3182670993.
4. Lipreading and audio-visual speech perception.
   Philos Trans R Soc Lond B Biol Sci. 1992 Jan 29;335(1273):71-8. doi: 10.1098/rstb.1992.0009.
5. How hearing aids, background noise, and visual cues influence objective listening effort.
   Ear Hear. 2013 Sep;34(5):e52-64. doi: 10.1097/AUD.0b013e31827f0431.
6. Auditory-visual perception of speech with reduced optical clarity.
   J Speech Hear Res. 1979 Jun;22(2):212-23. doi: 10.1044/jshr.2202.212.
7. Activation of auditory cortex during silent lipreading.
   Science. 1997 Apr 25;276(5312):593-6. doi: 10.1126/science.276.5312.593.
8. Children use visual speech to compensate for non-intact auditory speech.
   J Exp Child Psychol. 2014 Oct;126:295-312. doi: 10.1016/j.jecp.2014.05.003. Epub 2014 Jul 4.
9. Visual speech alters the discrimination and identification of non-intact auditory speech in children with hearing loss.
   Int J Pediatr Otorhinolaryngol. 2017 Mar;94:127-137. doi: 10.1016/j.ijporl.2017.01.009. Epub 2017 Jan 9.
10. Integration efficiency for speech perception within and across sensory modalities by normal-hearing and hearing-impaired individuals.
    J Acoust Soc Am. 2007 Feb;121(2):1164-76. doi: 10.1121/1.2405859.

Cited By

1. Audio-Visual Speech Synchrony Impacts Gaze Patterns in Autism.
   J Autism Dev Disord. 2025 Aug 22. doi: 10.1007/s10803-025-06998-3.
2. Vibrotactile speech cues are associated with enhanced auditory processing in middle and superior temporal gyri.
   Sci Rep. 2025 Jul 12;15(1):25202. doi: 10.1038/s41598-025-07718-8.
3. Seeing a Talker's Mouth Reduces the Effort of Perceiving Speech and Repairing Perceptual Mistakes for Listeners With Cochlear Implants.
   Ear Hear. 2025 Jun 16. doi: 10.1097/AUD.0000000000001683.
4. Pupil Responses During Interactive Conversation.
   Trends Hear. 2025 Jan-Dec;29:23312165251342441. doi: 10.1177/23312165251342441. Epub 2025 May 14.
5. Neural Speech Tracking Contribution of Lip Movements Predicts Behavioral Deterioration When the Speaker's Mouth Is Occluded.
   eNeuro. 2025 Feb 5;12(2). doi: 10.1523/ENEURO.0368-24.2024. Print 2025 Feb.
6. When Hearing Lips and Seeing Voices Becomes Perceiving Speech: Auditory-Visual Integration in Lexical Access.
   Cogsci. 2011;33:1376-1381.
7. Synthetic faces generated with the facial action coding system or deep neural networks improve speech-in-noise perception, but not as much as real faces.
   Front Neurosci. 2024 May 9;18:1379988. doi: 10.3389/fnins.2024.1379988. eCollection 2024.
8. A standardised test to evaluate audio-visual speech intelligibility in French.
   Heliyon. 2024 Jan 14;10(2):e24750. doi: 10.1016/j.heliyon.2024.e24750. eCollection 2024 Jan 30.
9. Attention Drives Visual Processing and Audiovisual Integration During Multimodal Communication.
   J Neurosci. 2024 Mar 6;44(10):e0870232023. doi: 10.1523/JNEUROSCI.0870-23.2023.
10. The effects of temporal cues, point-light displays, and faces on speech identification and listening effort.
    PLoS One. 2023 Nov 29;18(11):e0290826. doi: 10.1371/journal.pone.0290826. eCollection 2023.