Vivian M. Ciaramitaro, Hiu Mei Chow, Luke G. Eglington
Department of Psychology, Developmental and Brain Sciences, University of Massachusetts, Boston, MA, USA
Department of Psychological and Brain Sciences, Dartmouth College, Hanover, NH, USA
J Vis. 2017 Mar 1;17(3):20. doi: 10.1167/17.3.20.
We used a cross-modal dual task to examine how changing visual-task demands influenced auditory processing, namely auditory thresholds for amplitude- and frequency-modulated sounds. Observers attended to two consecutive intervals of sounds and reported which interval contained the auditory stimulus that was modulated in amplitude (Experiment 1) or frequency (Experiment 2). During auditory-stimulus presentation, observers simultaneously attended to a rapid serial visual presentation (two consecutive intervals of streams of visual letters) and had to report which interval contained a particular color (low load, demanding fewer attentional resources) or, in separate blocks of trials, which interval contained more of a target letter (high load, demanding more attentional resources). We hypothesized that if attention is a shared resource across vision and audition, an easier visual task should free up more attentional resources for auditory processing on an unrelated task, thereby improving auditory sensitivity (i.e., lowering thresholds). Auditory detection thresholds were indeed lower, reflecting improved auditory sensitivity, for both amplitude- and frequency-modulated sounds when observers engaged in the less demanding (compared to the more demanding) visual task. In accord with previous work, our findings suggest that visual-task demands can influence the processing of auditory information on an unrelated concurrent task, providing support for shared attentional resources. More importantly, our results suggest that attending to information in a different modality (cross-modal attention) can influence basic auditory contrast sensitivity functions, highlighting potential similarities between basic mechanisms for visual and auditory attention.
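The abstract does not specify how detection thresholds were estimated, but two-interval tasks of this kind are commonly paired with an adaptive staircase that adjusts modulation depth trial by trial. As a minimal sketch (assuming a standard 2-down/1-up rule and a hypothetical noisy-threshold observer, not the authors' actual procedure), one could generate an amplitude-modulated tone and simulate threshold convergence like this:

```python
import numpy as np

def am_tone(carrier_hz=1000.0, mod_hz=8.0, depth=0.5, dur=0.5, fs=44100):
    """Sinusoidal amplitude-modulated tone; depth is the modulation
    index (0 = unmodulated carrier). Parameter values are illustrative."""
    t = np.arange(int(dur * fs)) / fs
    envelope = 1.0 + depth * np.sin(2 * np.pi * mod_hz * t)
    return envelope * np.sin(2 * np.pi * carrier_hz * t)

def simulate_staircase(true_threshold=0.1, start_depth=0.5, step=0.05,
                       n_trials=60, rng=None):
    """2-down/1-up staircase: depth decreases after two consecutive
    correct responses, increases after an error, converging near the
    ~70.7%-correct point. The simulated observer is correct whenever
    the depth exceeds a noisy internal threshold (an assumption)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    depth = start_depth
    correct_streak = 0
    history = []
    for _ in range(n_trials):
        history.append(depth)
        correct = depth > true_threshold + rng.normal(0.0, 0.02)
        if correct:
            correct_streak += 1
            if correct_streak == 2:
                depth = max(depth - step, 0.0)
                correct_streak = 0
        else:
            depth += step
            correct_streak = 0
    return history

hist = simulate_staircase()
estimate = np.mean(hist[-20:])  # threshold estimate: mean of late-trial depths
```

Under shared attentional resources, one would expect the converged `estimate` to be lower (better sensitivity) in the low-load than the high-load visual condition.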