Nasir SM, Ostry DJ
Department of Psychology, McGill University, 1205 Dr. Penfield Avenue, Montreal, Quebec H3A 1B1, Canada.
Nat Neurosci. 2008 Oct;11(10):1217-22. doi: 10.1038/nn.2193. Epub 2008 Sep 14.
Speech production, like other sensorimotor behaviors, relies on multiple sensory inputs: audition, proprioceptive inputs from muscle spindles, and cutaneous inputs from mechanoreceptors in the skin and soft tissues of the vocal tract. However, the capacity of deaf speakers to produce intelligible speech suggests that somatosensory input alone may contribute to speech motor control and perhaps even to speech learning. We assessed speech motor learning in cochlear implant recipients who were tested with their implants turned off. A robotic device altered somatosensory feedback by displacing the jaw during speech. We found that implant subjects progressively adapted to the mechanical perturbation over the course of training. Moreover, the corrections we observed were for movement deviations that were exceedingly small, on the order of millimeters, indicating that speakers have precise somatosensory expectations. Speech motor learning is therefore substantially dependent on somatosensory input.
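As an illustration only (not the authors' analysis), the kind of progressive adaptation described above can be thought of as a residual path deviation that shrinks across training trials under error-driven compensation. The sketch below assumes a hypothetical constant perturbation magnitude, learning rate, and trial count; none of these values come from the study.

```python
# Minimal sketch, assuming a constant robot-imposed jaw displacement and a
# simple error-driven learner. All parameters are hypothetical and chosen
# only to illustrate how adaptation could be quantified as a decreasing
# per-trial deviation.
import numpy as np

PERTURBATION_MM = 2.0   # hypothetical lateral displacement imposed by the robot
LEARNING_RATE = 0.15    # hypothetical per-trial update toward cancelling the error
N_TRIALS = 60

compensation = 0.0      # learned command opposing the perturbation
deviations = []         # residual jaw-path deviation on each trial (mm)

for trial in range(N_TRIALS):
    # Residual deviation = imposed displacement minus learned compensation
    error = PERTURBATION_MM - compensation
    deviations.append(error)
    # Error-based update: nudge the compensation toward cancelling the deviation
    compensation += LEARNING_RATE * error

deviations = np.array(deviations)
print(f"initial deviation: {deviations[0]:.2f} mm")
print(f"final deviation:   {deviations[-1]:.2f} mm")
```

In this toy model the deviation decays geometrically toward zero, which is one simple way to picture "progressive adaptation with training"; the actual study measured adaptation from recorded jaw kinematics, not from a model of this kind.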