

Event-Based Visual/Inertial Odometry for UAV Indoor Navigation

Authors

Elamin Ahmed, El-Rabbany Ahmed, Jacob Sunil

Affiliations

Civil Engineering Department, Faculty of Engineering and Architectural Science, Toronto Metropolitan University, Toronto, ON M5B 2K3, Canada.

Civil Engineering Department, Faculty of Engineering, Zagazig University, Zagazig 10162, Egypt.

Publication

Sensors (Basel). 2024 Dec 25;25(1):61. doi: 10.3390/s25010061.

DOI:10.3390/s25010061
PMID:39796852
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11722967/
Abstract

Indoor navigation is becoming increasingly essential for multiple applications. It is complex and challenging due to dynamic scenes, limited space, and, more importantly, the unavailability of global navigation satellite system (GNSS) signals. Recently, new sensors have emerged, namely event cameras, which show great potential for indoor navigation due to their high dynamic range and low latency. In this study, an event-based visual-inertial odometry approach is proposed, emphasizing adaptive event accumulation and selective keyframe updates to reduce computational overhead. The proposed approach fuses events, standard frames, and inertial measurements for precise indoor navigation. Features are detected and tracked on the standard images. The events are accumulated into frames and used to track the features between the standard frames. Subsequently, the IMU measurements and the feature tracks are fused to continuously estimate the sensor states. The proposed approach is evaluated using both simulated and real-world datasets. Compared with the state-of-the-art U-SLAM algorithm, our approach achieves a substantial reduction in the mean positional error and RMSE in simulated environments, showing up to 50% and 47% reductions along the x- and y-axes, respectively. The approach achieves 5-10 ms latency per event batch and 10-20 ms for frame updates, demonstrating real-time performance on resource-constrained platforms. These results underscore the potential of our approach as a robust solution for real-world UAV indoor navigation scenarios.

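The paper itself ships no code. Purely as an illustrative sketch of the event-accumulation step described in the abstract (the function name, the (t, x, y, polarity) event layout, and the windowing choice are assumptions here, not the authors' implementation), accumulating events into a frame between two standard frames could look like:

```python
import numpy as np

def accumulate_events(events, t_start, t_end, height, width):
    """Accumulate polarity events within a time window into a 2D event frame.

    `events` is an (N, 4) array of (t, x, y, polarity) rows, polarity in
    {-1, +1}; each event adds its polarity to the pixel it fired at.
    """
    frame = np.zeros((height, width), dtype=np.float32)
    # Keep only events that fall between the two standard frames.
    mask = (events[:, 0] >= t_start) & (events[:, 0] < t_end)
    for t, x, y, p in events[mask]:
        frame[int(y), int(x)] += p
    return frame

# Toy example: three events inside the window, one outside it.
events = np.array([
    [0.01, 2, 3, +1],
    [0.02, 2, 3, +1],
    [0.03, 5, 1, -1],
    [0.20, 0, 0, +1],  # after t_end, ignored
])
frame = accumulate_events(events, t_start=0.0, t_end=0.1, height=8, width=8)
```

A frame like this can then be fed to a conventional feature tracker (e.g. KLT) to bridge feature tracks between standard frames, which is the role the abstract assigns to the accumulated events; an adaptive scheme would additionally vary the window length with the event rate.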

Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/39adf411c8dc/sensors-25-00061-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/c9b43f40921d/sensors-25-00061-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/b3f0f470df46/sensors-25-00061-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/faaa15f538e9/sensors-25-00061-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/48492edd4a13/sensors-25-00061-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/70f2f27aff87/sensors-25-00061-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/a89118982e14/sensors-25-00061-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/284bb6bb214a/sensors-25-00061-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/a0ca78f6e256/sensors-25-00061-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/d797ca7cea23/sensors-25-00061-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/1f42aeb2c5d1/sensors-25-00061-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/3f420d450bf2/sensors-25-00061-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/cc4b05b1209c/sensors-25-00061-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/25f4/11722967/8ab8451d0f2d/sensors-25-00061-g014.jpg

Similar Articles

1
Event-Based Visual/Inertial Odometry for UAV Indoor Navigation.
Sensors (Basel). 2024 Dec 25;25(1):61. doi: 10.3390/s25010061.
2
Robust Stereo Visual Inertial Navigation System Based on Multi-Stage Outlier Removal in Dynamic Environments.
Sensors (Basel). 2020 May 21;20(10):2922. doi: 10.3390/s20102922.
3
A GNSS/INS/LiDAR Integration Scheme for UAV-Based Navigation in GNSS-Challenging Environments.
Sensors (Basel). 2022 Dec 16;22(24):9908. doi: 10.3390/s22249908.
4
Benefits of Multi-Constellation/Multi-Frequency GNSS in a Tightly Coupled GNSS/IMU/Odometry Integration Algorithm.
Sensors (Basel). 2018 Sep 12;18(9):3052. doi: 10.3390/s18093052.
5
Deep Learning-Aided Inertial/Visual/LiDAR Integration for GNSS-Challenging Environments.
Sensors (Basel). 2023 Jun 29;23(13):6019. doi: 10.3390/s23136019.
6
Optical and Mass Flow Sensors for Aiding Vehicle Navigation in GNSS Denied Environment.
Sensors (Basel). 2020 Nov 17;20(22):6567. doi: 10.3390/s20226567.
7
Radar and Visual Odometry Integrated System Aided Navigation for UAVS in GNSS Denied Environment.
Sensors (Basel). 2018 Aug 23;18(9):2776. doi: 10.3390/s18092776.
8
Event-based feature tracking in a visual inertial odometry framework.
Front Robot AI. 2023 Feb 14;10:994488. doi: 10.3389/frobt.2023.994488. eCollection 2023.
9
Perception in the Dark-Development of a ToF Visual Inertial Odometry System.
Sensors (Basel). 2020 Feb 26;20(5):1263. doi: 10.3390/s20051263.
10
ESVIO: Event-Based Stereo Visual-Inertial Odometry.
Sensors (Basel). 2023 Feb 10;23(4):1998. doi: 10.3390/s23041998.

Cited By

1
A Novel Loosely Coupled Collaborative Localization Method Utilizing Integrated IMU-Aided Cameras for Multiple Autonomous Robots.
Sensors (Basel). 2025 May 13;25(10):3086. doi: 10.3390/s25103086.
2
Stereo Event-Based Visual-Inertial Odometry.
Sensors (Basel). 2025 Jan 31;25(3):887. doi: 10.3390/s25030887.

References

1
ESVIO: Event-Based Stereo Visual-Inertial Odometry.
Sensors (Basel). 2023 Feb 10;23(4):1998. doi: 10.3390/s23041998.
2
An Outline of Multi-Sensor Fusion Methods for Mobile Agents Indoor Navigation.
Sensors (Basel). 2021 Feb 25;21(5):1605. doi: 10.3390/s21051605.
3
Event-Based Vision: A Survey.
IEEE Trans Pattern Anal Mach Intell. 2022 Jan;44(1):154-180. doi: 10.1109/TPAMI.2020.3008413. Epub 2021 Dec 7.
4
A LiDAR and IMU Integrated Indoor Navigation System for UAVs and Its Application in Real-Time Pipeline Classification.
Sensors (Basel). 2017 Jun 2;17(6):1268. doi: 10.3390/s17061268.
5
A Motion-Based Feature for Event-Based Pattern Recognition.
Front Neurosci. 2017 Jan 4;10:594. doi: 10.3389/fnins.2016.00594. eCollection 2016.
6
Asynchronous Event-Based Multikernel Algorithm for High-Speed Visual Features Tracking.
IEEE Trans Neural Netw Learn Syst. 2015 Aug;26(8):1710-20. doi: 10.1109/TNNLS.2014.2352401. Epub 2014 Sep 16.