

IT-SVO: Improved Semi-Direct Monocular Visual Odometry Combined with JS Divergence in Restricted Mobile Devices

Authors

Liu Chang, Zhao Jin, Sun Nianyi, Yang Qingrong, Wang Leilei

Affiliations

School of Mechanical Engineering, Guizhou University, Guiyang 550025, China.

Key Laboratory of Advanced Manufacturing Technology, Ministry of Education, Guizhou University, Guiyang 550025, China.

Publication

Sensors (Basel). 2021 Mar 12;21(6):2025. doi: 10.3390/s21062025.

DOI:10.3390/s21062025
PMID:33809347
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7998773/
Abstract

Simultaneous localization and mapping (SLAM) has a wide range of applications in mobile robotics. Lightweight and inexpensive vision sensors are widely used for localization in GPS-denied or weak-GPS environments. Mobile robots must not only estimate their pose but also correct their position according to the environment, so a proper mathematical model is required to obtain the state of the robot in its surroundings. Filter-based SLAM/VO typically models depth in the mapping thread as a Gaussian distribution, which involves a complicated relationship between mean and covariance; the covariance represents the uncertainty of map points. Methods from probability theory and information theory therefore play a significant role in estimating this uncertainty. In this paper, we combine information theory with classical semi-direct visual odometry (SVO), using Jensen-Shannon divergence (JS divergence) instead of Kullback-Leibler divergence (KL divergence) to estimate the uncertainty of depth. The resulting methodology, better suited to SVO, improves the accuracy and robustness of mobile devices in unknown environments. This paper also aims to make efficient use of small, portable hardware for localization and to provide a priori knowledge for subsequent application scenarios. JS divergence, integrated into SVO, not only distinguishes outliers accurately but also converges the inliers quickly. Under the same computational conditions, the results show that SVO combined with JS divergence localizes the robot in its environment more accurately than the combination with KL divergence.
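The paper's central substitution rests on two properties the abstract alludes to: JS divergence is symmetric and bounded (by ln 2 in nats), whereas KL divergence is asymmetric and can blow up when one distribution assigns near-zero mass where the other does not, which makes JS a more stable uncertainty measure for comparing depth distributions. A minimal numerical sketch for discrete distributions, not the paper's actual Gaussian depth-filter formulation (function names here are illustrative):

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D_KL(p || q) for discrete distributions.
    Asymmetric, and unbounded when q -> 0 where p > 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: average KL to the mixture m = (p+q)/2.
    Symmetric in (p, q) and bounded above by ln(2)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

if __name__ == "__main__":
    p = np.array([0.7, 0.2, 0.1])
    q = np.array([0.1, 0.2, 0.7])
    # JS(p, q) == JS(q, p), and both stay below ln(2) ~ 0.693,
    # while KL(p, q) != KL(q, p) in general.
    print(js_divergence(p, q), js_divergence(q, p), np.log(2))
```

The boundedness is what makes JS usable as a convergence/outlier criterion for the depth estimate: divergence values live on a fixed scale, so a single threshold can separate fast-converging inliers from outliers.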


Figures: sensors-21-02025-g001 through g021, available via the PMC full text above.

Similar articles

1. IT-SVO: Improved Semi-Direct Monocular Visual Odometry Combined with JS Divergence in Restricted Mobile Devices. Sensors (Basel). 2021 Mar 12;21(6):2025. doi: 10.3390/s21062025.
2. Adaptive Monocular Visual-Inertial SLAM for Real-Time Augmented Reality Applications in Mobile Devices. Sensors (Basel). 2017 Nov 7;17(11):2567. doi: 10.3390/s17112567.
3. A Multi-Sensorial Simultaneous Localization and Mapping (SLAM) System for Low-Cost Micro Aerial Vehicles in GPS-Denied Environments. Sensors (Basel). 2017 Apr 8;17(4):802. doi: 10.3390/s17040802.
4. Stereo Visual Odometry Pose Correction through Unsupervised Deep Learning. Sensors (Basel). 2021 Jul 11;21(14):4735. doi: 10.3390/s21144735.
5. On the Jensen-Shannon Symmetrization of Distances Relying on Abstract Means. Entropy (Basel). 2019 May 11;21(5):485. doi: 10.3390/e21050485.
6. Semantic visual simultaneous localization and mapping (SLAM) using deep learning for dynamic scenes. PeerJ Comput Sci. 2023 Oct 10;9:e1628. doi: 10.7717/peerj-cs.1628. eCollection 2023.
7. Monocular Visual SLAM Based on a Cooperative UAV-Target System. Sensors (Basel). 2020 Jun 22;20(12):3531. doi: 10.3390/s20123531.
8. SDVL: Efficient and Accurate Semi-Direct Visual Localization. Sensors (Basel). 2019 Jan 14;19(2):302. doi: 10.3390/s19020302.
9. A Monocular Visual Odometry Method Based on Virtual-Real Hybrid Map in Low-Texture Outdoor Environment. Sensors (Basel). 2021 May 13;21(10):3394. doi: 10.3390/s21103394.
10. Kullback-Leibler Divergence Based Probabilistic Approach for Device-Free Localization Using Channel State Information. Sensors (Basel). 2019 Nov 3;19(21):4783. doi: 10.3390/s19214783.

Cited by

1. Multi-Sensor Fusion for Wheel-Inertial-Visual Systems Using a Fuzzification-Assisted Iterated Error State Kalman Filter. Sensors (Basel). 2024 Nov 28;24(23):7619. doi: 10.3390/s24237619.
2. Monocular visual SLAM, visual odometry, and structure from motion methods applied to 3D reconstruction: A comprehensive survey. Heliyon. 2024 Sep 6;10(18):e37356. doi: 10.1016/j.heliyon.2024.e37356. eCollection 2024 Sep 30.

References

1. Robust RGB-D SLAM Using Point and Line Features for Low Textured Scene. Sensors (Basel). 2020 Sep 2;20(17):4984. doi: 10.3390/s20174984.
2. Visual Information Fusion through Bayesian Inference for Adaptive Probability-Oriented Feature Matching. Sensors (Basel). 2018 Jun 26;18(7):2041. doi: 10.3390/s18072041.
3. Direct Sparse Odometry. IEEE Trans Pattern Anal Mach Intell. 2018 Mar;40(3):611-625. doi: 10.1109/TPAMI.2017.2658577. Epub 2017 Apr 12.
4. MonoSLAM: real-time single camera SLAM. IEEE Trans Pattern Anal Mach Intell. 2007 Jun;29(6):1052-67. doi: 10.1109/TPAMI.2007.1049.