Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent?

Authors

Wahn Basil, König Peter

Affiliations

Institute of Cognitive Science, Universität Osnabrück, Osnabrück, Germany.

Institut für Neurophysiologie und Pathophysiologie, Universitätsklinikum Hamburg-Eppendorf, Hamburg, Germany.

Publication Information

Adv Cogn Psychol. 2017 Mar 31;13(1):83-96. doi: 10.5709/acp-0209-2. eCollection 2017.

DOI: 10.5709/acp-0209-2
PMID: 28450975
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5405449/
Abstract

Human information processing is limited by attentional resources. That is, via attentional mechanisms, humans select a limited amount of sensory input to process while other sensory input is neglected. In multisensory research, a matter of ongoing debate is whether there are distinct pools of attentional resources for each sensory modality or whether attentional resources are shared across sensory modalities. Recent studies have suggested that attentional resource allocation across sensory modalities is in part task-dependent. That is, the recruitment of attentional resources across the sensory modalities depends on whether processing involves object-based attention (e.g., the discrimination of stimulus attributes) or spatial attention (e.g., the localization of stimuli). In the present paper, we review findings in multisensory research related to this view. For the visual and auditory sensory modalities, findings suggest that distinct resources are recruited when humans perform object-based attention tasks, whereas for the visual and tactile sensory modalities, partially shared resources are recruited. If object-based attention tasks are time-critical, shared resources are recruited across the sensory modalities. When humans perform an object-based attention task in combination with a spatial attention task, partly shared resources are recruited across the sensory modalities as well. Conversely, for spatial attention tasks, attentional processing consistently involves shared attentional resources across the sensory modalities. Generally, findings suggest that the attentional system flexibly allocates attentional resources depending on task demands. We propose that such flexibility reflects a large-scale optimization strategy that minimizes the brain's costly resource expenditures while maximizing the capability to process currently relevant information.


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/78bd/5405449/7c2f257ef622/acp-13-083-g001.jpg

Similar Articles

1. Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent?
   Adv Cogn Psychol. 2017 Mar 31;13(1):83-96. doi: 10.5709/acp-0209-2. eCollection 2017.
2. Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources.
   Front Integr Neurosci. 2016 Mar 8;10:13. doi: 10.3389/fnint.2016.00013. eCollection 2016.
3. Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources.
   Iperception. 2017 Jan 1;8(1):2041669516688026. doi: 10.1177/2041669516688026. eCollection 2017 Jan-Feb.
4. Shared or Distinct Attentional Resources? Confounds in Dual Task Designs, Countermeasures, and Guidelines.
   Multisens Res. 2019 Jan 1;32(2):145-163. doi: 10.1163/22134808-20181328.
5. Vision and Haptics Share Spatial Attentional Resources and Visuotactile Integration Is Not Affected by High Attentional Load.
   Multisens Res. 2015;28(3-4):371-92. doi: 10.1163/22134808-00002482.
6. Audition and vision share spatial attentional resources, yet attentional load does not disrupt audiovisual integration.
   Front Psychol. 2015 Jul 29;6:1084. doi: 10.3389/fpsyg.2015.01084. eCollection 2015.
7. Can Limitations of Visuospatial Attention Be Circumvented? A Review.
   Front Psychol. 2017 Oct 27;8:1896. doi: 10.3389/fpsyg.2017.01896. eCollection 2017.
8. Object-based attention is multisensory: co-activation of an object's representations in ignored sensory modalities.
   Eur J Neurosci. 2007 Jul;26(2):499-509. doi: 10.1111/j.1460-9568.2007.05668.x.
9. Cross-Modal Attention Effects in the Vestibular Cortex during Attentive Tracking of Moving Objects.
   J Neurosci. 2016 Dec 14;36(50):12720-12728. doi: 10.1523/JNEUROSCI.2480-16.2016. Epub 2016 Nov 7.
10. Interactions between voluntary and stimulus-driven spatial attention mechanisms across sensory modalities.
    J Cogn Neurosci. 2009 Dec;21(12):2384-97. doi: 10.1162/jocn.2008.21178.

Cited By

1. Effects of Virtual Reality-Based Interventions on Preoperative Anxiety in Patients Undergoing Elective Surgery With Anesthesia: Systematic Review and Meta-Analysis.
   J Med Internet Res. 2025 Apr 30;27:e55291. doi: 10.2196/55291.
2. Neurophysiology of ACL Injury.
   Orthop Rev (Pavia). 2025 Feb 19;17:129173. doi: 10.52965/001c.129173. eCollection 2025.
3. Electrotactile proprioception training improves finger control accuracy and potential mechanism is proprioceptive recalibration.
   Sci Rep. 2024 Nov 4;14(1):26568. doi: 10.1038/s41598-024-78063-5.
4. EEG β oscillations in aberrant data perception under cognitive load modulation.
   Sci Rep. 2024 Oct 3;14(1):22995. doi: 10.1038/s41598-024-74381-w.
5. Association of Auditory Interference and Ocular-Motor Response with Subconcussive Head Impacts in Adolescent Football Players.
   Neurotrauma Rep. 2024 May 31;5(1):512-521. doi: 10.1089/neur.2023.0125. eCollection 2024.
6. The effect of virtual reality versus standard-of-care treatment on pain perception during paediatric vaccination: A randomised controlled trial.
   J Clin Nurs. 2025 Mar;34(3):1045-1062. doi: 10.1111/jocn.17287. Epub 2024 Jun 14.
7. A bonus task boosts people's willingness to offload cognition to an algorithm.
   Cogn Res Princ Implic. 2024 Apr 23;9(1):24. doi: 10.1186/s41235-024-00550-0.
8. Auditory Attentional Load Modulates Audiovisual Integration During Auditory/Visual Discrimination.
   Adv Cogn Psychol. 2021 Jul 25;17(3):193-202. doi: 10.5709/acp-0328-0. eCollection 2021.
9. Perception and deception: Exploring individual responses to deepfakes across different modalities.
   Heliyon. 2023 Sep 21;9(10):e20383. doi: 10.1016/j.heliyon.2023.e20383. eCollection 2023 Oct.
10. Interoceptive abilities facilitate taking another's spatial perspective.
    Sci Rep. 2023 Jun 21;13(1):10064. doi: 10.1038/s41598-023-36173-6.

References Cited in This Article

1. Pupil Sizes Scale with Attentional Load and Task Experience in a Multiple Object Tracking Task.
   PLoS One. 2016 Dec 15;11(12):e0168087. doi: 10.1371/journal.pone.0168087. eCollection 2016.
2. Learning New Sensorimotor Contingencies: Effects of Long-Term Use of Sensory Augmentation on the Brain and Conscious Perception.
   PLoS One. 2016 Dec 13;11(12):e0166647. doi: 10.1371/journal.pone.0166647. eCollection 2016.
3. Bayesian Alternation during Tactile Augmentation.
   Front Behav Neurosci. 2016 Oct 7;10:187. doi: 10.3389/fnbeh.2016.00187. eCollection 2016.
4. Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources.
   Front Integr Neurosci. 2016 Mar 8;10:13. doi: 10.3389/fnint.2016.00013. eCollection 2016.
5. The COGs (context, object, and goals) in multisensory processing.
   Exp Brain Res. 2016 May;234(5):1307-23. doi: 10.1007/s00221-016-4590-z. Epub 2016 Mar 1.
6. Existence of competing modality dominances.
   Atten Percept Psychophys. 2016 May;78(4):1104-14. doi: 10.3758/s13414-016-1061-3.
7. Inattentional Deafness: Visual Load Leads to Time-Specific Suppression of Auditory Evoked Responses.
   J Neurosci. 2015 Dec 9;35(49):16046-54. doi: 10.1523/JNEUROSCI.2931-15.2015.
8. Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search.
   Ergonomics. 2016 Jun;59(6):781-95. doi: 10.1080/00140139.2015.1099742. Epub 2015 Nov 20.
9. The interactions of multisensory integration with endogenous and exogenous attention.
   Neurosci Biobehav Rev. 2016 Feb;61:208-24. doi: 10.1016/j.neubiorev.2015.11.002. Epub 2015 Nov 10.
10. Sustained attention and prediction: distinct brain maturation trajectories during adolescence.
    Front Hum Neurosci. 2015 Sep 24;9:519. doi: 10.3389/fnhum.2015.00519. eCollection 2015.