Department of Linguistics, University of British Columbia, Vancouver, British Columbia V6T 1Z4, Canada.
Nature. 2009 Nov 26;462(7272):502-4. doi: 10.1038/nature08572.
Visual information from a speaker's face can enhance or interfere with accurate auditory perception. This integration of information across auditory and visual streams has been observed in functional imaging studies, and has typically been attributed to the frequency and robustness with which perceivers jointly encounter event-specific information from these two modalities. Adding the tactile modality has long been considered a crucial next step in understanding multisensory integration. However, previous studies have found an influence of tactile input on speech perception only under limited circumstances, either when perceivers were aware of the task or when they had received training to establish a cross-modal mapping. Here we show that perceivers integrate naturalistic tactile information during auditory speech perception without previous training. Drawing on the observation that some speech sounds produce tiny bursts of aspiration (such as English 'p'), we applied slight, inaudible air puffs to participants' skin at one of two locations: the right hand or the neck. Syllables heard simultaneously with cutaneous air puffs were more likely to be heard as aspirated (for example, causing participants to mishear 'b' as 'p'). These results demonstrate that perceivers integrate event-relevant tactile information in auditory perception in much the same way as they do visual information.
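The behavioural claim can be illustrated with a short simulation: a simultaneous cutaneous air puff shifts categorization of an ambiguous 'b'/'p' token toward the aspirated percept. The sketch below is purely illustrative; the trial count and response probabilities are hypothetical assumptions, not values from the paper, which reports only the direction and reliability of the shift.

```python
# Minimal illustrative sketch of the reported aero-tactile effect.
# All numbers here (trial count, response probabilities) are assumed
# for demonstration; they are not taken from Gick & Derrick (2009).
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200  # hypothetical number of trials per condition

# Assumed P(respond "aspirated") for an ambiguous /ba/-/pa/ token,
# with and without a simultaneous, inaudible air puff on the skin.
p_aspirated = {"no_puff": 0.45, "puff": 0.65}

for condition, p in p_aspirated.items():
    # True = syllable categorized as aspirated (heard as 'p')
    responses = rng.random(n_trials) < p
    print(f"{condition}: {responses.mean():.2f} heard as aspirated")
```

Under these assumed parameters the puff condition yields a higher proportion of "aspirated" responses, which is the qualitative pattern the study reports for both stimulation sites (hand and neck).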