Department of Neuroscience, University of Rochester Medical Center, Rochester, NY 14642, USA.
Curr Biol. 2023 Jun 19;33(12):2557-2565.e4. doi: 10.1016/j.cub.2023.05.032. Epub 2023 Jun 5.
Primates have evolved sophisticated, visually guided reaching behaviors for interacting with dynamic objects, such as insects, during foraging. Reaching control in dynamic natural conditions requires active prediction of the target's future position to compensate for visuo-motor processing delays and to enhance online movement adjustments. Past reaching research in non-human primates mainly focused on seated subjects performing repeated ballistic arm movements to targets that were either stationary or displaced instantaneously during the movement. However, those approaches impose task constraints that limit the natural dynamics of reaching. A recent field study highlights predictive aspects of visually guided reaching during insect prey capture in wild marmoset monkeys. To examine the complementary dynamics of similar natural behavior within a laboratory context, we developed an ecologically motivated, unrestrained reach-to-grasp task involving live crickets. We used multiple high-speed video cameras to capture the movements of common marmosets (Callithrix jacchus) and crickets stereoscopically and applied machine vision algorithms for marker-free object and hand tracking. Contrary to estimates under traditional constrained reaching paradigms, we find that reaching for dynamic targets can operate at remarkably short visuo-motor delays of around 80 ms, rivaling the closed-loop speeds typical of the oculomotor system during visual pursuit. Multivariate linear regression modeling of the kinematic relationships between hand and cricket velocity revealed that predictions of the target's expected future location can compensate for visuo-motor delays during fast reaching. These results suggest a critical role of visual prediction in facilitating online movement adjustments for dynamic prey.
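The delay analysis described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: it assumes one standard way such an estimate is made, namely regressing hand velocity on target (cricket) velocity shifted by a range of candidate lags and taking the lag that maximizes the variance explained. The frame rate, lag, gain, and noise values below are illustrative, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
fps = 250        # assumed camera frame rate (illustrative, not from the paper)
n = 2000         # number of frames of synthetic tracking data
true_lag = 20    # 20 frames at 250 fps = 80 ms, matching the reported delay

# Synthetic 2-D target velocity (smooth random walk, mean-centered) and a
# hand velocity that follows it at a fixed delay with gain and noise.
target_v = np.cumsum(rng.normal(size=(n, 2)), axis=0)
target_v -= target_v.mean(axis=0)
hand_v = np.roll(target_v, true_lag, axis=0) * 0.9 \
         + rng.normal(scale=0.5, size=(n, 2))

def r2_at_lag(lag):
    """Least-squares fit of hand velocity on target velocity shifted by `lag`,
    returning the pooled fraction of variance explained."""
    X = target_v[:n - lag]     # target velocity `lag` frames in the past
    Y = hand_v[lag:]           # current hand velocity
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ B
    return 1.0 - resid.var() / Y.var()

# Scan candidate lags and pick the best-fitting one.
lags = np.arange(0, 60)
best = lags[np.argmax([r2_at_lag(lag) for lag in lags])]
delay_ms = 1000 * best / fps
print(f"estimated visuo-motor delay: {best} frames = {delay_ms:.0f} ms")
```

Because the hand here lags the target by construction, the scan recovers the built-in 80 ms delay; on real tracking data the same lag scan would estimate the behavioral visuo-motor delay, with the regression coefficients capturing how strongly hand velocity tracks past target velocity.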