


Searchers adjust their eye-movement dynamics to target characteristics in natural scenes.

Affiliations

Department of Psychology, University of Potsdam, Karl-Liebknechtstraße 24/25, 14476, Potsdam, Germany.

Neural Information Processing Group, University of Tübingen, Sand 6, 72076, Tübingen, Germany.

Publication

Sci Rep. 2019 Feb 7;9(1):1635. doi: 10.1038/s41598-018-37548-w.

DOI:10.1038/s41598-018-37548-w
PMID:30733470
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6367441/
Abstract

When searching a target in a natural scene, it has been shown that both the target's visual properties and similarity to the background influence whether and how fast humans are able to find it. So far, it was unclear whether searchers adjust the dynamics of their eye movements (e.g., fixation durations, saccade amplitudes) to the target they search for. In our experiment, participants searched natural scenes for six artificial targets with different spatial frequency content throughout eight consecutive sessions. High-spatial frequency targets led to smaller saccade amplitudes and shorter fixation durations than low-spatial frequency targets if target identity was known. If a saccade was programmed in the same direction as the previous saccade, fixation durations and successive saccade amplitudes were not influenced by target type. Visual saliency and empirical fixation density at the endpoints of saccades which maintain direction were comparatively low, indicating that these saccades were less selective. Our results suggest that searchers adjust their eye movement dynamics to the search target efficiently, since previous research has shown that low-spatial frequencies are visible farther into the periphery than high-spatial frequencies. We interpret the saccade direction specificity of our effects as an underlying separation into a default scanning mechanism and a selective, target-dependent mechanism.


Figures (1-9):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9bec/6367441/12d8fae72d35/41598_2018_37548_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9bec/6367441/26bcab11be75/41598_2018_37548_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9bec/6367441/aac25591e869/41598_2018_37548_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9bec/6367441/d120abc91741/41598_2018_37548_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9bec/6367441/a4430ab8c4ba/41598_2018_37548_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9bec/6367441/42086a444f72/41598_2018_37548_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9bec/6367441/12d9ba9c35cd/41598_2018_37548_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9bec/6367441/9ecaba816495/41598_2018_37548_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9bec/6367441/bfbb15fc1974/41598_2018_37548_Fig9_HTML.jpg
