
Similar Articles

1. Integration time for the perception of depth from motion parallax.
   Vision Res. 2012 Apr 15;59:64-71. doi: 10.1016/j.visres.2012.02.007. Epub 2012 Mar 1.
2. Motion parallax thresholds for unambiguous depth perception.
   Vision Res. 2015 Oct;115(Pt A):40-7. doi: 10.1016/j.visres.2015.07.002. Epub 2015 Aug 22.
3. Aging does not affect integration times for the perception of depth from motion parallax.
   Vision Res. 2017 Nov;140:81-88. doi: 10.1016/j.visres.2017.05.016. Epub 2017 Sep 5.
4. Visual depth from motion parallax and eye pursuit.
   J Math Biol. 2012 Jun;64(7):1157-88. doi: 10.1007/s00285-011-0445-1. Epub 2011 Jun 22.
5. The pursuit theory of motion parallax.
   Vision Res. 2006 Dec;46(28):4709-25. doi: 10.1016/j.visres.2006.07.006. Epub 2006 Nov 2.
6. The motion/pursuit law for visual depth perception from motion parallax.
   Vision Res. 2009 Jul;49(15):1969-78. doi: 10.1016/j.visres.2009.05.008. Epub 2009 May 20.
7. A functional link between MT neurons and depth perception based on motion parallax.
   J Neurosci. 2015 Feb 11;35(6):2766-77. doi: 10.1523/JNEUROSCI.3134-14.2015.
8. The role of eye movements in depth from motion parallax during infancy.
   J Vis. 2013 Dec 18;13(14):15. doi: 10.1167/13.14.15.
9. A Pursuit Theory Account for the Perception of Common Motion in Motion Parallax.
   Perception. 2016 Sep;45(9):991-1007. doi: 10.1177/0301006616643679. Epub 2016 Apr 7.
10. The effects of aging on the perception of depth from motion parallax.
    Atten Percept Psychophys. 2016 Aug;78(6):1681-91. doi: 10.3758/s13414-016-1134-3.

Cited By

1. Flexible computation of object motion and depth based on viewing geometry inferred from optic flow.
   bioRxiv. 2025 May 19:2024.10.29.620928. doi: 10.1101/2024.10.29.620928.
2. Aging does not affect integration times for the perception of depth from motion parallax.
   Vision Res. 2017 Nov;140:81-88. doi: 10.1016/j.visres.2017.05.016. Epub 2017 Sep 5.
3. Motion parallax thresholds for unambiguous depth perception.
   Vision Res. 2015 Oct;115(Pt A):40-7. doi: 10.1016/j.visres.2015.07.002. Epub 2015 Aug 22.
4. In pursuit of perspective: does vertical perspective disambiguate depth from motion parallax?
   Perception. 2013;42(6):631-41. doi: 10.1068/p7250.
5. The role of eye movements in depth from motion parallax during infancy.
   J Vis. 2013 Dec 18;13(14):15. doi: 10.1167/13.14.15.
6. Visual depth from motion parallax and eye pursuit.
   J Math Biol. 2012 Jun;64(7):1157-88. doi: 10.1007/s00285-011-0445-1. Epub 2011 Jun 22.

References

1. Visual depth from motion parallax and eye pursuit.
   J Math Biol. 2012 Jun;64(7):1157-88. doi: 10.1007/s00285-011-0445-1. Epub 2011 Jun 22.
2. Eye movements: the past 25 years.
   Vision Res. 2011 Jul 1;51(13):1457-83. doi: 10.1016/j.visres.2010.12.014. Epub 2011 Jan 13.
3. Perception of three-dimensional structure from motion.
   Trends Cogn Sci. 1998 Jun 1;2(6):222-8. doi: 10.1016/s1364-6613(98)01181-4.
4. MT neurons combine visual motion with a smooth eye movement signal to code depth-sign from motion parallax.
   Neuron. 2009 Aug 27;63(4):523-32. doi: 10.1016/j.neuron.2009.07.029.
5. Spatial allocation of attention during smooth pursuit eye movements.
   Vision Res. 2009 Jun;49(10):1275-85. doi: 10.1016/j.visres.2009.01.011.
6. The motion/pursuit law for visual depth perception from motion parallax.
   Vision Res. 2009 Jul;49(15):1969-78. doi: 10.1016/j.visres.2009.05.008. Epub 2009 May 20.
7. Oculomotor capture by transient events: a comparison of abrupt onsets, offsets, motion, and flicker.
   J Vis. 2008 Oct 23;8(14):11.1-16. doi: 10.1167/8.14.11.
8. Evidence for a link between the extra-retinal component of random-onset pursuit and the anticipatory pursuit of predictable object motion.
   J Neurophysiol. 2008 Aug;100(2):1135-46. doi: 10.1152/jn.00060.2008. Epub 2008 Jul 2.
9. Motion parallax contribution to perception of self-motion and depth.
   Biol Cybern. 2008 Apr;98(4):273-93. doi: 10.1007/s00422-008-0224-2. Epub 2008 Mar 26.
10. A neural representation of depth from motion parallax in macaque visual cortex.
    Nature. 2008 Apr 3;452(7187):642-5. doi: 10.1038/nature06814. Epub 2008 Mar 16.

Integration time for the perception of depth from motion parallax.

Authors

Nawrot Mark, Stroyan Keith

Affiliation

Center for Visual Neuroscience, Department of Psychology, North Dakota State University, Fargo, ND 58108, USA.

Publication

Vision Res. 2012 Apr 15;59:64-71. doi: 10.1016/j.visres.2012.02.007. Epub 2012 Mar 1.

DOI: 10.1016/j.visres.2012.02.007
PMID: 22406543
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC3349336/
Abstract

The perception of depth from relative motion is believed to be a slow process that "builds up" over a period of observation. However, in the case of motion parallax, the potential accuracy of the depth estimate suffers as the observer translates during the viewing period. Our recent quantitative model for the perception of depth from motion parallax proposes that relative object depth (d) can be determined from retinal image motion (dθ/dt), pursuit eye movement (dα/dt), and fixation distance (f) by the formula: d/f ≈ dθ/dα. Given the model's dynamics, it is important to know the integration time required by the visual system to recover dα and dθ, and then estimate d. Knowing the minimum integration time reveals the incumbent error in this process. A depth-phase discrimination task was used to determine the time necessary to perceive depth-sign from motion parallax. Observers remained stationary and viewed a briefly translating random-dot motion parallax stimulus. Stimulus duration varied between trials. Fixation on the translating stimulus was monitored and enforced with an eye-tracker. The study found that relative depth discrimination can be performed with presentations as brief as 16.6 ms, with only two stimulus frames providing both retinal image motion and the stimulus window motion for pursuit (mean range = 16.6-33.2 ms). This was found for conditions in which, prior to stimulus presentation, the eye was engaged in ongoing pursuit or the eye was stationary. A large high-contrast masking stimulus disrupted depth discrimination for stimulus presentations less than 70-75 ms in both pursuit and stationary conditions. This interval might be linked to ocular-following response eye-movement latencies. We conclude that neural mechanisms serving depth from motion parallax generate a depth estimate much more quickly than previously believed. We propose that additional sluggishness might be due to the visual system's attempt to determine the maximum dθ/dα ratio for a selection of points on a complicated stimulus.

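The abstract's quantitative model recovers relative depth from the ratio of retinal image motion to pursuit eye movement, scaled by fixation distance (d/f ≈ dθ/dα). The sketch below illustrates that arithmetic only; the function name and variable names are our own and do not come from the paper.

```python
def relative_depth(d_theta_dt: float, d_alpha_dt: float, fixation_dist: float) -> float:
    """Illustrative sketch of the motion/pursuit law d/f ≈ dθ/dα.

    dθ/dα reduces to (dθ/dt) / (dα/dt) because the common dt cancels
    when both rates are sampled over the same interval. Inputs are in
    deg/s; fixation_dist and the returned depth share the same unit.
    """
    if d_alpha_dt == 0:
        # With no pursuit signal the ratio is undefined (depth-sign is ambiguous).
        raise ValueError("pursuit rate must be nonzero for a depth estimate")
    return fixation_dist * (d_theta_dt / d_alpha_dt)

# Example: retinal motion of 0.5 deg/s against a pursuit rate of
# 5 deg/s at a 1.0 m fixation distance gives d ≈ 0.1 m.
print(relative_depth(0.5, 5.0, 1.0))
```

Note that this gives only the ratio-based depth estimate; the paper's point is how little viewing time (as brief as 16.6 ms) the visual system needs to acquire the two rates feeding this computation.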