
Human visual exploration reduces uncertainty about the sensed world.

Authors

Mirza M Berk, Adams Rick A, Mathys Christoph, Friston Karl J

Affiliations

Wellcome Trust Centre for Neuroimaging, Institute of Neurology, University College London, London, United Kingdom.

Institute of Cognitive Neuroscience, University College London, London, United Kingdom.

Publication

PLoS One. 2018 Jan 5;13(1):e0190429. doi: 10.1371/journal.pone.0190429. eCollection 2018.

DOI: 10.1371/journal.pone.0190429
PMID: 29304087
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5755757/
Abstract

In previous papers, we introduced a normative scheme for scene construction and epistemic (visual) searches based upon active inference. This scheme provides a principled account of how people decide where to look when categorising a visual scene based on its contents. In this paper, we use active inference to explain the visual searches of normal human subjects, enabling us to answer some key questions about visual foraging and salience attribution. First, we asked whether there is any evidence for 'epistemic foraging'; i.e., exploration that resolves uncertainty about a scene. In brief, we used Bayesian model comparison to compare Markov decision process (MDP) models of scan-paths that did, and did not, contain the epistemic, uncertainty-resolving imperatives for action selection. In the course of this model comparison, we discovered that it was necessary to include non-epistemic (heuristic) policies to explain observed behaviour (e.g., a reading-like strategy that involved scanning from left to right). Despite this use of heuristic policies, model comparison showed that there is substantial evidence for epistemic foraging in the visual exploration of even simple scenes. Second, we compared MDP models that did, and did not, allow for changes in prior expectations over successive blocks of the visual search paradigm. We found that implicit prior beliefs about the speed and accuracy of visual searches changed systematically with experience. Finally, we characterised intersubject variability in terms of subject-specific prior beliefs. Specifically, we used canonical correlation analysis to see if there were any mixtures of prior expectations that could predict between-subject differences in performance, thereby establishing a quantitative link between different behavioural phenotypes and Bayesian belief updating. We demonstrated that better scene categorisation performance is consistently associated with lower reliance on heuristics; i.e., a greater use of a generative model of the scene to direct its exploration.
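The epistemic, uncertainty-resolving imperative described in the abstract can be illustrated with a toy calculation: the epistemic value of fixating a location is the expected information gain, i.e. the mutual information between the hidden scene category and the outcome sampled at that location. The sketch below is illustrative only (not the authors' code); the toy likelihood matrices and names are hypothetical.

```python
import numpy as np

def epistemic_value(prior, likelihood):
    """Expected information gain I(o; s) of sampling a location.

    prior      : P(s), shape (n_states,), assumed strictly positive here
    likelihood : P(o | s), shape (n_outcomes, n_states)
    """
    joint = likelihood * prior                 # P(o, s)
    p_o = joint.sum(axis=1, keepdims=True)     # marginal P(o)
    # mutual information, with 0 * log(0) treated as 0
    ratio = np.where(joint > 0, joint / (p_o * prior), 1.0)
    return float((joint * np.log(ratio)).sum())

prior = np.array([0.5, 0.5])                   # two scene categories, flat prior
locations = {
    "informative":   np.eye(2),                # outcome identifies the category
    "uninformative": np.full((2, 2), 0.5),     # outcome says nothing
}
values = {loc: epistemic_value(prior, A) for loc, A in locations.items()}
best = max(values, key=values.get)             # saccade target under pure epistemic foraging
```

A purely epistemic policy saccades to the location with the highest expected information gain; the paper's point is that human scan-paths mix this imperative with heuristic policies.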

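The canonical correlation analysis used to relate subject-specific prior expectations to performance can likewise be sketched. This is a generic numpy-only CCA (QR decomposition followed by an SVD of the cross-product of the orthonormal bases), not the authors' implementation, and the data below are synthetic.

```python
import numpy as np

def canonical_correlations(X, Y):
    """Canonical correlations between the column spaces of X and Y."""
    X = X - X.mean(axis=0)                 # centre both blocks
    Y = Y - Y.mean(axis=0)
    qx, _ = np.linalg.qr(X)                # orthonormal basis for col(X)
    qy, _ = np.linalg.qr(Y)                # orthonormal basis for col(Y)
    # singular values of qx' qy are the canonical correlations
    return np.linalg.svd(qx.T @ qy, compute_uv=False)

rng = np.random.default_rng(0)
priors = rng.normal(size=(20, 3))          # 20 subjects, 3 hypothetical prior parameters
performance = priors[:, :1] * 2.0          # performance fully explained by the first prior
r = canonical_correlations(priors, performance)
```

When a mixture of priors perfectly predicts performance, as in this synthetic case, the leading canonical correlation is 1; in the paper, the analogous correlations quantify how strongly behavioural phenotypes track estimated prior beliefs.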

Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6073/5755757/f295d48c7014/pone.0190429.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6073/5755757/12b63d98c5ac/pone.0190429.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6073/5755757/30e8c0b8ebef/pone.0190429.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6073/5755757/813dcc42bf49/pone.0190429.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6073/5755757/030ea0a8348d/pone.0190429.g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6073/5755757/320fa6d33b21/pone.0190429.g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6073/5755757/2d94f67982a6/pone.0190429.g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6073/5755757/b6742d92e34f/pone.0190429.g008.jpg

Similar Articles

1. Human visual exploration reduces uncertainty about the sensed world. PLoS One. 2018 Jan 5;13(1):e0190429. doi: 10.1371/journal.pone.0190429. eCollection 2018.
2. Scene Construction, Visual Foraging, and Active Inference. Front Comput Neurosci. 2016 Jun 14;10:56. doi: 10.3389/fncom.2016.00056. eCollection 2016.
3. Deep Active Inference and Scene Construction. Front Artif Intell. 2020 Oct 28;3:509354. doi: 10.3389/frai.2020.509354. eCollection 2020.
4. Uncertainty, epistemics and active inference. J R Soc Interface. 2017 Nov;14(136). doi: 10.1098/rsif.2017.0376.
5. Active inference and epistemic value. Cogn Neurosci. 2015;6(4):187-214. doi: 10.1080/17588928.2015.1020053. Epub 2015 Mar 13.
6. The Effectiveness of Integrated Care Pathways for Adults and Children in Health Care Settings: A Systematic Review. JBI Libr Syst Rev. 2009;7(3):80-129. doi: 10.11124/01938924-200907030-00001.
7. Generalised free energy and active inference. Biol Cybern. 2019 Dec;113(5-6):495-513. doi: 10.1007/s00422-019-00805-w. Epub 2019 Sep 27.
8. Deep temporal models and active inference. Neurosci Biobehav Rev. 2017 Jun;77:388-402. doi: 10.1016/j.neubiorev.2017.04.009. Epub 2017 Apr 14.
9. Why are the batteries in the microwave?: Use of semantic information under uncertainty in a search task. Cogn Res Princ Implic. 2021 Apr 14;6(1):32. doi: 10.1186/s41235-021-00294-1.
10. "This Is What We Don't Know": Treating Epistemic Uncertainty in Bayesian Networks for Risk Assessment. Integr Environ Assess Manag. 2021 Jan;17(1):221-232. doi: 10.1002/ieam.4367. Epub 2020 Dec 3.

Cited By

1. SPM-30 years and beyond. Cereb Cortex. 2025 Aug 1;35(8). doi: 10.1093/cercor/bhaf234.
2. Adaptive learning rate in dynamical binary environments: the signature of adaptive information processing. Cogn Neurodyn. 2024 Dec;18(6):4009-4031. doi: 10.1007/s11571-024-10128-7. Epub 2024 Oct 21.
3. Generative models for sequential dynamics in active inference. Cogn Neurodyn. 2024 Dec;18(6):3259-3272. doi: 10.1007/s11571-023-09963-x. Epub 2023 Apr 26.
4. A dual foveal-peripheral visual processing model implements efficient saccade selection. J Vis. 2020 Aug 3;20(8):22. doi: 10.1167/jov.20.8.22.
5. Cognitive effort and active inference. Neuropsychologia. 2023 Jun 6;184:108562. doi: 10.1016/j.neuropsychologia.2023.108562. Epub 2023 Apr 18.
6. Active inference and the two-step task. Sci Rep. 2022 Oct 21;12(1):17682. doi: 10.1038/s41598-022-21766-4.
7. Foraging for the self: Environment selection for agency inference. Psychon Bull Rev. 2023 Apr;30(2):608-620. doi: 10.3758/s13423-022-02187-w. Epub 2022 Oct 11.
8. Beauty and Uncertainty as Transformative Factors: A Free Energy Principle Account of Aesthetic Diagnosis and Intervention in Gestalt Psychotherapy. Front Hum Neurosci. 2022 Jul 13;16:906188. doi: 10.3389/fnhum.2022.906188. eCollection 2022.
9. Embodied Object Representation Learning and Recognition. Front Neurorobot. 2022 Apr 14;16:840658. doi: 10.3389/fnbot.2022.840658. eCollection 2022.
10. How Psychedelic-Assisted Treatment Works in the Bayesian Brain. Front Psychiatry. 2022 Mar 8;13:812180. doi: 10.3389/fpsyt.2022.812180. eCollection 2022.

References

1. Active Inference: A Process Theory. Neural Comput. 2017 Jan;29(1):1-49. doi: 10.1162/NECO_a_00912. Epub 2016 Nov 21.
2. Scene Construction, Visual Foraging, and Active Inference. Front Comput Neurosci. 2016 Jun 14;10:56. doi: 10.3389/fncom.2016.00056. eCollection 2016.
3. Active inference and learning. Neurosci Biobehav Rev. 2016 Sep;68:862-879. doi: 10.1016/j.neubiorev.2016.06.022. Epub 2016 Jun 29.
4. Bayesian model reduction and empirical Bayes for group (DCM) studies. Neuroimage. 2016 Mar;128:413-431. doi: 10.1016/j.neuroimage.2015.11.015. Epub 2015 Nov 11.
5. Active inference and epistemic value. Cogn Neurosci. 2015;6(4):187-214. doi: 10.1080/17588928.2015.1020053. Epub 2015 Mar 13.
6. Planning as inference. Trends Cogn Sci. 2012 Oct;16(10):485-8. doi: 10.1016/j.tics.2012.08.006. Epub 2012 Aug 30.
7. Bayesian surprise attracts human attention. Vision Res. 2009 Jun;49(10):1295-306. doi: 10.1016/j.visres.2008.09.007. Epub 2008 Oct 19.
8. Variational free energy and the Laplace approximation. Neuroimage. 2007 Jan 1;34(1):220-34. doi: 10.1016/j.neuroimage.2006.08.035. Epub 2006 Oct 20.
9. Object appearance, disappearance, and attention prioritization in real-world scenes. Psychon Bull Rev. 2005 Dec;12(6):1061-7. doi: 10.3758/bf03206444.
10. Prioritization of new objects in real-world scenes: evidence from eye movements. J Exp Psychol Hum Percept Perform. 2005 Oct;31(5):857-68. doi: 10.1037/0096-1523.31.5.857.