
Object-based auditory facilitation of visual search for pictures and words with frequent and rare targets.

Author Information

Iordanescu Lucica, Grabowecky Marcia, Suzuki Satoru

Affiliations

Department of Psychology, Northwestern University, 2029 Sheridan Road, Evanston, IL 60208, United States.

Publication Information

Acta Psychol (Amst). 2011 Jun;137(2):252-9. doi: 10.1016/j.actpsy.2010.07.017. Epub 2010 Sep 22.

Abstract

Auditory and visual processes demonstrably enhance each other based on spatial and temporal coincidence. Our recent results on visual search have shown that auditory signals also enhance visual salience of specific objects based on multimodal experience. For example, we tend to see an object (e.g., a cat) and simultaneously hear its characteristic sound (e.g., "meow"), to name an object when we see it, and to vocalize a word when we read it, but we do not tend to see a word (e.g., cat) and simultaneously hear the characteristic sound (e.g., "meow") of the named object. If auditory-visual enhancements occur based on this pattern of experiential associations, playing a characteristic sound (e.g., "meow") should facilitate visual search for the corresponding object (e.g., an image of a cat), hearing a name should facilitate visual search for both the corresponding object and corresponding word, but playing a characteristic sound should not facilitate visual search for the name of the corresponding object. Our present and prior results together confirmed these experiential association predictions. We also recently showed that the underlying object-based auditory-visual interactions occur rapidly (within 220ms) and guide initial saccades towards target objects. If object-based auditory-visual enhancements are automatic and persistent, an interesting application would be to use characteristic sounds to facilitate visual search when targets are rare, such as during baggage screening. Our participants searched for a gun among other objects when a gun was presented on only 10% of the trials. The search time was speeded when a gun sound was played on every trial (primarily on gun-absent trials); importantly, playing gun sounds facilitated both gun-present and gun-absent responses, suggesting that object-based auditory-visual enhancements persistently increase the detectability of guns rather than simply biasing gun-present responses. Thus, object-based auditory-visual interactions that derive from experiential associations rapidly and persistently increase visual salience of corresponding objects.
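The rare-target manipulation described above (a gun present on only 10% of trials, with the characteristic gun sound played on every trial in the sound condition) can be summarized with a small trial-scheduling sketch. The Python below is a minimal illustration under assumed parameters (200 trials per block, a hypothetical make_trials helper); it is not the authors' experimental code or materials.

# Hypothetical illustration of the rare-target design: 10% gun-present
# trials, with a gun sound accompanying every search display in the
# sound condition. Parameter values are illustrative assumptions.
import random

def make_trials(n_trials=200, target_prevalence=0.10, play_sound=True, seed=0):
    """Build a shuffled trial list for one block."""
    rng = random.Random(seed)
    n_present = round(n_trials * target_prevalence)
    trials = (
        [{"target_present": True} for _ in range(n_present)]
        + [{"target_present": False} for _ in range(n_trials - n_present)]
    )
    rng.shuffle(trials)
    for trial in trials:
        # The sound is played on every trial, whether or not the gun
        # appears, so any benefit on gun-absent trials reflects a
        # persistent change in target detectability, not a cue to
        # target presence on that trial.
        trial["sound"] = "gun" if play_sound else None
    return trials

if __name__ == "__main__":
    block = make_trials()
    present = sum(t["target_present"] for t in block)
    print(f"{present} gun-present trials out of {len(block)}")

Comparing response times between blocks built with play_sound=True and play_sound=False, separately for gun-present and gun-absent trials, mirrors the comparison reported in the abstract.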


Similar Articles

1. Object-based auditory facilitation of visual search for pictures and words with frequent and rare targets.
   Acta Psychol (Amst). 2011 Jun;137(2):252-9. doi: 10.1016/j.actpsy.2010.07.017. Epub 2010 Sep 22.
2. Characteristic sounds make you look at target objects more quickly.
   Atten Percept Psychophys. 2010 Oct;72(7):1736-41. doi: 10.3758/APP.72.7.1736.
3. Characteristic sounds facilitate visual search.
   Psychon Bull Rev. 2008 Jun;15(3):548-54. doi: 10.3758/pbr.15.3.548.
4. Auditory and Semantic Cues Facilitate Decoding of Visual Object Category in MEG.
   Cereb Cortex. 2020 Mar 21;30(2):597-606. doi: 10.1093/cercor/bhz110.
5. What You See Is What You Hear: Sounds Alter the Contents of Visual Perception.
   Psychol Sci. 2022 Dec;33(12):2109-2122. doi: 10.1177/09567976221121348. Epub 2022 Sep 30.
6. Sound effects: Multimodal input helps infants find displaced objects.
   Br J Dev Psychol. 2017 Sep;35(3):317-333. doi: 10.1111/bjdp.12165. Epub 2016 Nov 21.
7. Interacting parallel pathways associate sounds with visual identity in auditory cortices.
   Neuroimage. 2016 Jan 1;124(Pt A):858-868. doi: 10.1016/j.neuroimage.2015.09.044. Epub 2015 Sep 28.
8. See what I hear? Beat perception in auditory and visual rhythms.
   Exp Brain Res. 2012 Jul;220(1):51-61. doi: 10.1007/s00221-012-3114-8. Epub 2012 May 24.

Cited By

1. Interference from semantically distracting sounds in action scene search.
   Atten Percept Psychophys. 2025 Feb;87(2):498-510. doi: 10.3758/s13414-025-03023-8. Epub 2025 Feb 6.
2. The eyes speak when the mouth cannot: Using eye movements to interpret omissions in primary progressive aphasia.
   Neuropsychologia. 2023 Jun 6;184:108530. doi: 10.1016/j.neuropsychologia.2023.108530. Epub 2023 Mar 9.
3. Online mouse cursor trajectories distinguish phonological activation by linguistic and nonlinguistic sounds.
   Psychon Bull Rev. 2023 Feb;30(1):362-372. doi: 10.3758/s13423-022-02153-6. Epub 2022 Jul 26.
4. Unequal allocation of overt and covert attention in Multiple Object Tracking.
   Atten Percept Psychophys. 2022 Jul;84(5):1519-1537. doi: 10.3758/s13414-022-02501-7. Epub 2022 May 13.
5. Memory after visual search: Overlapping phonology, shared meaning, and bilingual experience influence what we remember.
   Brain Lang. 2021 Nov;222:105012. doi: 10.1016/j.bandl.2021.105012. Epub 2021 Aug 28.
6. Cross-Modal Interaction Between Auditory and Visual Input Impacts Memory Retrieval.
   Front Neurosci. 2021 Jul 26;15:661477. doi: 10.3389/fnins.2021.661477. eCollection 2021.
7. Preverbal infants utilize cross-modal semantic congruency in artificial grammar acquisition.
   Sci Rep. 2018 Aug 23;8(1):12707. doi: 10.1038/s41598-018-30927-3.

References

1. Attention and the crossmodal construction of space.
   Trends Cogn Sci. 1998 Jul 1;2(7):254-62. doi: 10.1016/S1364-6613(98)01188-7.
2. Characteristic sounds make you look at target objects more quickly.
   Atten Percept Psychophys. 2010 Oct;72(7):1736-41. doi: 10.3758/APP.72.7.1736.
3. Even in correctable search, some types of rare targets are frequently missed.
   Atten Percept Psychophys. 2009 Apr;71(3):541-53. doi: 10.3758/APP.71.3.541.
4. Pip and pop: nonspatial auditory signals improve spatial visual search.
   J Exp Psychol Hum Percept Perform. 2008 Oct;34(5):1053-65. doi: 10.1037/0096-1523.34.5.1053.
5. Vigilance requires hard mental work and is stressful.
   Hum Factors. 2008 Jun;50(3):433-41. doi: 10.1518/001872008X312152.
6. Characteristic sounds facilitate visual search.
   Psychon Bull Rev. 2008 Jun;15(3):548-54. doi: 10.3758/pbr.15.3.548.
7. Self-awareness affects vision.
   Curr Biol. 2008 May 20;18(10):R414-R415. doi: 10.1016/j.cub.2008.03.009.
8. Low target prevalence is a stubborn source of errors in visual search tasks.
   J Exp Psychol Gen. 2007 Nov;136(4):623-38. doi: 10.1037/0096-3445.136.4.623.
9. Auditory-visual crossmodal integration in perception of face gender.
   Curr Biol. 2007 Oct 9;17(19):1680-5. doi: 10.1016/j.cub.2007.08.043. Epub 2007 Sep 6.
