
Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent?

Authors

Wahn Basil, König Peter

Affiliations

Institute of Cognitive Science, Universität Osnabrück, Osnabrück, Germany.

Institut für Neurophysiologie und Pathophysiologie, Universitätsklinikum Hamburg-Eppendorf, Hamburg, Germany.

Publication Information

Adv Cogn Psychol. 2017 Mar 31;13(1):83-96. doi: 10.5709/acp-0209-2. eCollection 2017.


DOI: 10.5709/acp-0209-2
PMID: 28450975
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5405449/
Abstract

Human information processing is limited by attentional resources. That is, via attentional mechanisms, humans select a limited amount of sensory input to process while other sensory input is neglected. In multisensory research, a matter of ongoing debate is whether there are distinct pools of attentional resources for each sensory modality or whether attentional resources are shared across sensory modalities. Recent studies have suggested that attentional resource allocation across sensory modalities is in part task-dependent. That is, the recruitment of attentional resources across the sensory modalities depends on whether processing involves object-based attention tasks (e.g., the discrimination of stimulus attributes) or spatial attention tasks (e.g., the localization of stimuli). In the present paper, we review findings in multisensory research related to this view. For the visual and auditory sensory modalities, findings suggest that distinct resources are recruited when humans perform object-based attention tasks, whereas for the visual and tactile sensory modalities, partially shared resources are recruited. If object-based attention tasks are time-critical, shared resources are recruited across the sensory modalities. When humans perform an object-based attention task in combination with a spatial attention task, partly shared resources are recruited across the sensory modalities as well. Conversely, for spatial attention tasks, attentional processing consistently involves shared attentional resources for the sensory modalities. Generally, findings suggest that the attentional system flexibly allocates attentional resources depending on task demands. We propose that such flexibility reflects a large-scale optimization strategy that minimizes the brain's costly resource expenditures and simultaneously maximizes the capability to process currently relevant information.
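The closing claim, that flexible allocation is an optimization balancing resource cost against processing capability, can be read as a constrained trade-off. The following is an illustrative sketch in our own notation, not a formalization taken from the paper: let $r_v, r_a, r_t \ge 0$ denote attentional resources allocated to the visual, auditory, and tactile modalities under a capacity budget $R$, let $P$ be task performance, $C$ the cost of deploying resources, and $\lambda$ a weight trading cost against performance. The proposed strategy then amounts to

$$\max_{r_v,\, r_a,\, r_t \ge 0} \; P(r_v, r_a, r_t; \text{task}) - \lambda\, C(r_v, r_a, r_t) \quad \text{subject to} \quad r_v + r_a + r_t \le R.$$

On this reading, a single binding budget $R$ corresponds to a shared pool of attentional resources, separate per-modality budgets correspond to distinct pools, and task-dependence means the constraint set (or the shape of $P$) changes with task demands.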

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/78bd/5405449/7c2f257ef622/acp-13-083-g001.jpg

Similar Articles

[1]
Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent?

Adv Cogn Psychol. 2017-3-31

[2]
Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources.

Front Integr Neurosci. 2016-3-8

[3]
Auditory Stimulus Detection Partially Depends on Visuospatial Attentional Resources.

Iperception. 2017-1-1

[4]
Shared or Distinct Attentional Resources? Confounds in Dual Task Designs, Countermeasures, and Guidelines.

Multisens Res. 2019-1-1

[5]
Vision and Haptics Share Spatial Attentional Resources and Visuotactile Integration Is Not Affected by High Attentional Load.

Multisens Res. 2015

[6]
Audition and vision share spatial attentional resources, yet attentional load does not disrupt audiovisual integration.

Front Psychol. 2015-7-29

[7]
Can Limitations of Visuospatial Attention Be Circumvented? A Review.

Front Psychol. 2017-10-27

[8]
Object-based attention is multisensory: co-activation of an object's representations in ignored sensory modalities.

Eur J Neurosci. 2007-7

[9]
Cross-Modal Attention Effects in the Vestibular Cortex during Attentive Tracking of Moving Objects.

J Neurosci. 2016-12-14

[10]
Interactions between voluntary and stimulus-driven spatial attention mechanisms across sensory modalities.

J Cogn Neurosci. 2009-12

Cited By

[1]
Effects of Virtual Reality-Based Interventions on Preoperative Anxiety in Patients Undergoing Elective Surgery With Anesthesia: Systematic Review and Meta-Analysis.

J Med Internet Res. 2025-4-30

[2]
Neurophysiology of ACL Injury.

Orthop Rev (Pavia). 2025-2-19

[3]
Electrotactile proprioception training improves finger control accuracy and potential mechanism is proprioceptive recalibration.

Sci Rep. 2024-11-4

[4]
EEG β oscillations in aberrant data perception under cognitive load modulation.

Sci Rep. 2024-10-3

[5]
Association of Auditory Interference and Ocular-Motor Response with Subconcussive Head Impacts in Adolescent Football Players.

Neurotrauma Rep. 2024-5-31

[6]
The effect of virtual reality versus standard-of-care treatment on pain perception during paediatric vaccination: A randomised controlled trial.

J Clin Nurs. 2025-3

[7]
A bonus task boosts people's willingness to offload cognition to an algorithm.

Cogn Res Princ Implic. 2024-4-23

[8]
Auditory Attentional Load Modulates Audiovisual Integration During Auditory/Visual Discrimination.

Adv Cogn Psychol. 2021-7-25

[9]
Perception and deception: Exploring individual responses to deepfakes across different modalities.

Heliyon. 2023-9-21

[10]
Interoceptive abilities facilitate taking another's spatial perspective.

Sci Rep. 2023-6-21

References

[1]
Pupil Sizes Scale with Attentional Load and Task Experience in a Multiple Object Tracking Task.

PLoS One. 2016-12-15

[2]
Learning New Sensorimotor Contingencies: Effects of Long-Term Use of Sensory Augmentation on the Brain and Conscious Perception.

PLoS One. 2016-12-13

[3]
Bayesian Alternation during Tactile Augmentation.

Front Behav Neurosci. 2016-10-7

[4]
Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources.

Front Integr Neurosci. 2016-3-8

[5]
The COGs (context, object, and goals) in multisensory processing.

Exp Brain Res. 2016-5

[6]
Existence of competing modality dominances.

Atten Percept Psychophys. 2016-5

[7]
Inattentional Deafness: Visual Load Leads to Time-Specific Suppression of Auditory Evoked Responses.

J Neurosci. 2015-12-9

[8]
Multisensory teamwork: using a tactile or an auditory display to exchange gaze information improves performance in joint visual search.

Ergonomics. 2016-6

[9]
The interactions of multisensory integration with endogenous and exogenous attention.

Neurosci Biobehav Rev. 2016-2

[10]
Sustained attention and prediction: distinct brain maturation trajectories during adolescence.

Front Hum Neurosci. 2015-9-24
