
Similar Articles

1
Engaging the Articulators Enhances Perception of Concordant Visible Speech Movements.
J Speech Lang Hear Res. 2019 Oct 25;62(10):3679-3688. doi: 10.1044/2019_JSLHR-S-19-0167. Epub 2019 Oct 2.
2
Human Sensorimotor Cortex Control of Directly Measured Vocal Tract Movements during Vowel Production.
J Neurosci. 2018 Mar 21;38(12):2955-2966. doi: 10.1523/JNEUROSCI.2382-17.2018. Epub 2018 Feb 8.
3
Asymmetries in unimodal visual vowel perception: The roles of oral-facial kinematics, orientation, and configuration.
J Exp Psychol Hum Percept Perform. 2018 Jul;44(7):1103-1118. doi: 10.1037/xhp0000518. Epub 2018 Mar 8.
4
Rapid change in articulatory lip movement induced by preceding auditory feedback during production of bilabial plosives.
PLoS One. 2010 Nov 8;5(11):e13866. doi: 10.1371/journal.pone.0013866.
5
Inside Speech: Multisensory and Modality-specific Processing of Tongue and Lip Speech Actions.
J Cogn Neurosci. 2017 Mar;29(3):448-466. doi: 10.1162/jocn_a_01057. Epub 2016 Oct 19.
6
Phase relations of jaw and tongue tip movements in the production of VCV utterances.
J Acoust Soc Am. 1991 Oct;90(4 Pt 1):1806-15. doi: 10.1121/1.401661.
7
Production of bite-block vowels: acoustic equivalence by selective compensation.
J Acoust Soc Am. 1981 Mar;69(3):802-10. doi: 10.1121/1.385591.
8
Recalibration of auditory perception of speech due to orofacial somatosensory inputs during speech motor adaptation.
J Neurophysiol. 2019 Nov 1;122(5):2076-2084. doi: 10.1152/jn.00028.2019. Epub 2019 Sep 11.
9
The timing of articulatory gestures: evidence for relational invariants.
J Acoust Soc Am. 1984 Oct;76(4):1030-6. doi: 10.1121/1.391421.
10
Visual Context Enhanced: The Joint Contribution of Iconic Gestures and Visible Speech to Degraded Speech Comprehension.
J Speech Lang Hear Res. 2017 Jan 1;60(1):212-222. doi: 10.1044/2016_JSLHR-H-16-0101.

Cited By

1
The Effect of Somatosensory Input on Word Recognition in Typical Children and Those With Speech Sound Disorder.
J Speech Lang Hear Res. 2023 Jan 12;66(1):84-97. doi: 10.1044/2022_JSLHR-22-00226. Epub 2023 Jan 5.
2
Neurophysiological Correlates of Asymmetries in Vowel Perception: An English-French Cross-Linguistic Event-Related Potential Study.
Front Hum Neurosci. 2021 Jun 3;15:607148. doi: 10.3389/fnhum.2021.607148. eCollection 2021.
3
Neural indicators of articulator-specific sensorimotor influences on infant speech perception.
Proc Natl Acad Sci U S A. 2021 May 18;118(20). doi: 10.1073/pnas.2025043118.
4
When Additional Training Isn't Enough: Further Evidence That Unpredictable Speech Inhibits Adaptation.
J Speech Lang Hear Res. 2020 Jun 22;63(6):1700-1711. doi: 10.1044/2020_JSLHR-19-00380. Epub 2020 May 20.

References

1
Asymmetric discrimination of nonspeech tonal analogues of vowels.
J Exp Psychol Hum Percept Perform. 2019 Feb;45(2):285-300. doi: 10.1037/xhp0000603. Epub 2018 Dec 20.
2
Asymmetries in unimodal visual vowel perception: The roles of oral-facial kinematics, orientation, and configuration.
J Exp Psychol Hum Percept Perform. 2018 Jul;44(7):1103-1118. doi: 10.1037/xhp0000518. Epub 2018 Mar 8.
3
Sensorimotor adaptation of voice fundamental frequency in Parkinson's disease.
PLoS One. 2018 Jan 26;13(1):e0191839. doi: 10.1371/journal.pone.0191839. eCollection 2018.
4
A universal bias in adult vowel perception - By ear or by eye.
Cognition. 2017 Sep;166:358-370. doi: 10.1016/j.cognition.2017.06.001. Epub 2017 Jun 8.
5
Articulating What Infants Attune to in Native Speech.
Ecol Psychol. 2016 Oct 1;28(4):216-261. doi: 10.1080/10407413.2016.1230372. Epub 2016 Nov 1.
6
Sensorimotor control of vocal pitch and formant frequencies in Parkinson's disease.
Brain Res. 2016 Sep 1;1646:269-277. doi: 10.1016/j.brainres.2016.06.013. Epub 2016 Jun 8.
7
Sensorimotor influences on speech perception in infancy.
Proc Natl Acad Sci U S A. 2015 Nov 3;112(44):13531-6. doi: 10.1073/pnas.1508631112. Epub 2015 Oct 12.
8
Infants' brain responses to speech suggest analysis by synthesis.
Proc Natl Acad Sci U S A. 2014 Aug 5;111(31):11238-45. doi: 10.1073/pnas.1410963111. Epub 2014 Jul 14.
9
Information for coarticulation: Static signal properties or formant dynamics?
J Exp Psychol Hum Percept Perform. 2014 Jun;40(3):1228-36. doi: 10.1037/a0036214. Epub 2014 Apr 14.
10
Audiovisual speech integration does not rely on the motor system: evidence from articulatory suppression, the McGurk effect, and fMRI.
J Cogn Neurosci. 2014 Mar;26(3):606-20. doi: 10.1162/jocn_a_00515. Epub 2013 Nov 18.

Engaging the Articulators Enhances Perception of Concordant Visible Speech Movements.

Affiliations

Department of Speech, Language and Hearing Sciences, Boston University, MA.

Department of Biomedical Engineering, Boston University, MA.

Publication Information

J Speech Lang Hear Res. 2019 Oct 25;62(10):3679-3688. doi: 10.1044/2019_JSLHR-S-19-0167. Epub 2019 Oct 2.

DOI: 10.1044/2019_JSLHR-S-19-0167
PMID: 31577522
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7201334/
Abstract

Purpose: This study aimed to test whether (and how) somatosensory feedback signals from the vocal tract affect concurrent unimodal visual speech perception.

Method: Participants discriminated pairs of silent visual utterances of vowels under 3 experimental conditions: (a) normal (baseline) and while holding either (b) a bite block or (c) a lip tube in their mouths. To test the specificity of somatosensory-visual interactions during perception, we assessed discrimination of vowel contrasts optically distinguished based on their mandibular (English /ɛ/-/æ/) or labial (English /u/-French /u/) postures. In addition, we assessed perception of each contrast using dynamically articulating videos and static (single-frame) images of each gesture (at vowel midpoint).

Results: Engaging the jaw selectively facilitated perception of the dynamic gestures optically distinct in terms of jaw height, whereas engaging the lips selectively facilitated perception of the dynamic gestures optically distinct in terms of their degree of lip compression and protrusion. Thus, participants perceived visible speech movements in relation to the configuration and shape of their own vocal tract (and possibly their ability to produce covert vowel production-like movements). In contrast, engaging the articulators had no effect when the speaking faces did not move, suggesting that the somatosensory inputs affected perception of time-varying kinematic information rather than changes in target (movement end point) mouth shapes.

Conclusions: These findings suggest that orofacial somatosensory inputs associated with speech production prime premotor and somatosensory brain regions involved in the sensorimotor control of speech, thereby facilitating perception of concordant visible speech movements.

Supplemental Material: https://doi.org/10.23641/asha.9911846