
Monocular distance estimation from optic flow during active landing maneuvers.

Authors

van Breugel Floris, Morgansen Kristi, Dickinson Michael H

Affiliation

California Institute of Technology, Pasadena, CA, USA.

Publication

Bioinspir Biomim. 2014 Jun;9(2):025002. doi: 10.1088/1748-3182/9/2/025002. Epub 2014 May 22.

DOI: 10.1088/1748-3182/9/2/025002
PMID: 24855045
Abstract

Vision is arguably the most widely used sensor for position and velocity estimation in animals, and it is increasingly used in robotic systems as well. Many animals use stereopsis and object recognition in order to make a true estimate of distance. For a tiny insect such as a fruit fly or honeybee, however, these methods fall short. Instead, an insect must rely on calculations of optic flow, which can provide a measure of the ratio of velocity to distance, but not either parameter independently. Nevertheless, flies and other insects are adept at landing on a variety of substrates, a behavior that inherently requires some form of distance estimation in order to trigger distance-appropriate motor actions such as deceleration or leg extension. Previous studies have shown that these behaviors are indeed under visual control, raising the question: how does an insect estimate distance solely using optic flow? In this paper we use a nonlinear control theoretic approach to propose a solution for this problem. Our algorithm takes advantage of visually controlled landing trajectories that have been observed in flies and honeybees. Finally, we implement our algorithm, which we term dynamic peering, using a camera mounted to a linear stage to demonstrate its real-world feasibility.

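The abstract's central constraint can be illustrated with a toy calculation (this is a sketch of the general idea, not the paper's actual controller): optic flow from an approaching surface gives only the ratio ω = v/d. But if the agent actively applies a known acceleration a = dv/dt, differentiating ω = v/d (with ḋ = −v) yields ω̇ = a/d + ω², so distance can be recovered as d = a/(ω̇ − ω²). A minimal numerical sketch under these assumptions (the function names `simulate_flow` and `estimate_distance` are hypothetical):

```python
# Toy illustration: optic flow alone gives omega = v/d, a ratio only.
# An active maneuver with KNOWN acceleration a disambiguates it:
#   omega_dot = a/d + omega**2   =>   d = a / (omega_dot - omega**2)

def simulate_flow(d0, v0, a, dt, steps):
    """Euler-integrate an approach toward a surface; return optic-flow samples."""
    d, v = d0, v0
    flow = []
    for _ in range(steps):
        flow.append(v / d)   # observable: ratio of speed to distance
        d -= v * dt          # distance shrinks at the approach speed
        v += a * dt          # speed changes by the commanded acceleration
    return flow

def estimate_distance(flow, a, dt):
    """Recover absolute distance from two successive flow samples."""
    omega = flow[-2]
    omega_dot = (flow[-1] - flow[-2]) / dt  # finite-difference derivative
    return a / (omega_dot - omega**2)

dt = 1e-4
flow = simulate_flow(d0=1.0, v0=0.5, a=-0.2, dt=dt, steps=2)
d_hat = estimate_distance(flow, a=-0.2, dt=dt)
print(round(d_hat, 3))  # close to the true initial distance of 1.0
```

Note that if a = 0 (constant speed), the formula degenerates: without an active change in self-motion, the flow ratio carries no absolute scale, which is why the landing maneuver itself is essential to the estimate.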

Similar Articles

1. Monocular distance estimation from optic flow during active landing maneuvers.
   Bioinspir Biomim. 2014 Jun;9(2):025002. doi: 10.1088/1748-3182/9/2/025002. Epub 2014 May 22.
2. Monocular distance estimation with optical flow maneuvers and efference copies: a stability-based strategy.
   Bioinspir Biomim. 2016 Jan 7;11(1):016004. doi: 10.1088/1748-3190/11/1/016004.
3. A direct optic flow-based strategy for inverse flight altitude estimation with monocular vision and IMU measurements.
   Bioinspir Biomim. 2018 Mar 20;13(3):036004. doi: 10.1088/1748-3190/aaa2be.
4. A μ analysis-based, controller-synthesis framework for robust bioinspired visual navigation in less-structured environments.
   Bioinspir Biomim. 2014 Jun;9(2):025011. doi: 10.1088/1748-3182/9/2/025011. Epub 2014 May 22.
5. Controlling free flight of a robotic fly using an onboard vision sensor inspired by insect ocelli.
   J R Soc Interface. 2014 Aug 6;11(97):20140281. doi: 10.1098/rsif.2014.0281.
6. Flying over uneven moving terrain based on optic-flow cues without any need for reference frames or accelerometers.
   Bioinspir Biomim. 2015 Feb 26;10(2):026003. doi: 10.1088/1748-3182/10/2/026003.
7. Optic flow-based collision-free strategies: From insects to robots.
   Arthropod Struct Dev. 2017 Sep;46(5):703-717. doi: 10.1016/j.asd.2017.06.003. Epub 2017 Jul 11.
8. A test bed for insect-inspired robotic control.
   Philos Trans A Math Phys Eng Sci. 2003 Oct 15;361(1811):2267-85. doi: 10.1098/rsta.2003.1259.
9. A bio-inspired flying robot sheds light on insect piloting abilities.
   Curr Biol. 2007 Feb 20;17(4):329-35. doi: 10.1016/j.cub.2006.12.032. Epub 2007 Feb 8.
10. Robust post-stall perching with a simple fixed-wing glider using LQR-Trees.
    Bioinspir Biomim. 2014 Jun;9(2):025013. doi: 10.1088/1748-3182/9/2/025013. Epub 2014 May 22.

Cited By

1. A compact multisensory representation of self-motion is sufficient for computing an external world variable.
   bioRxiv. 2025 May 9:2025.05.09.653128. doi: 10.1101/2025.05.09.653128.
2. Wind gates olfaction-driven search states in free flight.
   Curr Biol. 2024 Oct 7;34(19):4397-4411.e6. doi: 10.1016/j.cub.2024.07.009. Epub 2024 Jul 26.
3. Bumblebees compensate for the adverse effects of sidewind during visually guided landings.
   J Exp Biol. 2024 Apr 15;227(8). doi: 10.1242/jeb.245432. Epub 2024 Apr 22.
4. A Nonlinear Observability Analysis of Ambient Wind Estimation with Uncalibrated Sensors, Inspired by Insect Neural Encoding.
   Proc IEEE Conf Decis Control. 2021 Dec;2021:1399-1406. doi: 10.1109/cdc45484.2021.9683219. Epub 2022 Feb 1.
5. Visual guidance of honeybees approaching a vertical landing surface.
   J Exp Biol. 2023 Sep 1;226(17). doi: 10.1242/jeb.245956.
6. Lessons from natural flight for aviation: then, now and tomorrow.
   J Exp Biol. 2023 Apr 25;226(Suppl_1). doi: 10.1242/jeb.245409. Epub 2023 Apr 17.
7. Active anemosensing hypothesis: how flying insects could estimate ambient wind direction through sensory integration and active movement.
   J R Soc Interface. 2022 Aug;19(193):20220258. doi: 10.1098/rsif.2022.0258. Epub 2022 Aug 31.
8. ARTFLOW: A Fast, Biologically Inspired Neural Network that Learns Optic Flow Templates for Self-Motion Estimation.
   Sensors (Basel). 2021 Dec 8;21(24):8217. doi: 10.3390/s21248217.
9. Oscillations make a self-scaled model for honeybees' visual odometer reliable regardless of flight trajectory.
   J R Soc Interface. 2021 Sep;18(182):20210567. doi: 10.1098/rsif.2021.0567. Epub 2021 Sep 8.
10. Insect inspired vision-based velocity estimation through spatial pooling of optic flow during linear motion.
    Bioinspir Biomim. 2021 Sep 9;16(6). doi: 10.1088/1748-3190/ac1f7b.