


A Robust Semi-Direct 3D SLAM for Mobile Robot Based on Dense Optical Flow in Dynamic Scenes.

Authors

Hu Bo, Luo Jingwen

Affiliations

School of Information Science and Technology, Yunnan Normal University, No. 768 Juxian Street, Chenggong District, Kunming 650500, China.

Publication

Biomimetics (Basel). 2023 Aug 16;8(4):371. doi: 10.3390/biomimetics8040371.

DOI: 10.3390/biomimetics8040371
PMID: 37622976
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10452154/
Abstract

Dynamic objects bring about a large number of error accumulations in pose estimation of mobile robots in dynamic scenes, and result in the failure to build a map that is consistent with the surrounding environment. Along these lines, this paper presents a robust semi-direct 3D simultaneous localization and mapping (SLAM) algorithm for mobile robots based on dense optical flow. First, a preliminary estimation of the robot's pose is conducted using the sparse direct method and the homography matrix is utilized to compensate for the current frame image to reduce the image deformation caused by rotation during the robot's motion. Then, by calculating the dense optical flow field of two adjacent frames and segmenting the dynamic region in the scene based on the dynamic threshold, the local map points projected within the dynamic regions are eliminated. On this basis, the robot's pose is optimized by minimizing the reprojection error. Moreover, a high-performance keyframe selection strategy is developed, and keyframes are inserted when the robot's pose is successfully tracked. Meanwhile, feature points are extracted and matched to the keyframes for subsequent optimization and mapping. Considering that the direct method is subject to tracking failure in practical application scenarios, the feature points and map points of keyframes are employed in robot relocation. Finally, all keyframes and map points are used as optimization variables for global bundle adjustment (BA) optimization, so as to construct a globally consistent 3D dense octree map. A series of simulations and experiments demonstrate the superior performance of the proposed algorithm.

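The dynamic-region step described in the abstract (compute a dense flow field between adjacent frames, segment regions exceeding a dynamic threshold, then cull local map points projected into those regions) can be sketched in a few lines of numpy. This is a minimal illustration under stated assumptions, not the paper's implementation: the flow field here is synthetic (a real pipeline would use a dense optical flow algorithm on homography-compensated frames), and the median + MAD rule is a hypothetical stand-in for the paper's dynamic threshold, which the abstract does not specify.

```python
import numpy as np

def dynamic_mask(flow, k=2.0):
    """Segment dynamic regions from a dense optical flow field.

    flow: (H, W, 2) per-pixel displacement between two adjacent frames,
    assumed already compensated for camera rotation (e.g. by warping the
    current frame with a homography, as in the paper). A pixel is flagged
    dynamic when its flow magnitude exceeds an adaptive threshold of
    median + k * MAD (a hypothetical choice for illustration).
    """
    mag = np.linalg.norm(flow, axis=2)
    med = np.median(mag)
    mad = np.median(np.abs(mag - med))
    return mag > med + k * mad

def filter_map_points(uv, mask):
    """Drop local map points whose projections (u, v) fall in dynamic regions."""
    h, w = mask.shape
    u = np.clip(uv[:, 0].astype(int), 0, w - 1)
    v = np.clip(uv[:, 1].astype(int), 0, h - 1)
    return uv[~mask[v, u]]

# Toy example: static background (zero flow) plus one 20x20 moving region.
flow = np.zeros((100, 100, 2))
flow[40:60, 40:60] = (5.0, 0.0)              # dynamic object
mask = dynamic_mask(flow)                    # flags the 400 moving pixels
pts = np.array([[10.0, 10.0], [50.0, 50.0]]) # one static, one dynamic projection
kept = filter_map_points(pts, mask)          # only the static point survives
```

Only the surviving (static) projections would then enter the reprojection-error minimization that refines the robot's pose.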

Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/e9495057e0c5/biomimetics-08-00371-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/32bb82aea3bc/biomimetics-08-00371-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/3ba7373852c9/biomimetics-08-00371-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/02eb3b318135/biomimetics-08-00371-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/9d9adf1fb305/biomimetics-08-00371-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/6aca3c5b4f48/biomimetics-08-00371-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/bec2ff68d398/biomimetics-08-00371-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/2c6bc0c92f8b/biomimetics-08-00371-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/4eac806a3e1f/biomimetics-08-00371-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/1d617184091a/biomimetics-08-00371-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/beee5e1b77b6/biomimetics-08-00371-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/149844d55ae4/biomimetics-08-00371-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/9b370a718942/biomimetics-08-00371-g013a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/978a29e70675/biomimetics-08-00371-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/339f479445db/biomimetics-08-00371-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d641/10452154/6315286f96d5/biomimetics-08-00371-g016a.jpg

Similar Articles

1
DMS-SLAM: A General Visual SLAM System for Dynamic Scenes with Multiple Sensors.
Sensors (Basel). 2019 Aug 27;19(17):3714. doi: 10.3390/s19173714.
2
Mobile Robot Localization and Mapping Algorithm Based on the Fusion of Image and Laser Point Cloud.
Sensors (Basel). 2022 May 28;22(11):4114. doi: 10.3390/s22114114.
3
AHY-SLAM: Toward Faster and More Accurate Visual SLAM in Dynamic Scenes Using Homogenized Feature Extraction and Object Detection Method.
Sensors (Basel). 2023 Apr 24;23(9):4241. doi: 10.3390/s23094241.
4
SLAM algorithm applied to robotics assistance for navigation in unknown environments.
J Neuroeng Rehabil. 2010 Feb 17;7:10. doi: 10.1186/1743-0003-7-10.
5
YPL-SLAM: A Simultaneous Localization and Mapping Algorithm for Point-line Fusion in Dynamic Environments.
Sensors (Basel). 2024 Jul 12;24(14):4517. doi: 10.3390/s24144517.
6
Research on Inter-Frame Feature Mismatch Removal Method of VSLAM in Dynamic Scenes.
Sensors (Basel). 2024 Feb 4;24(3):1007. doi: 10.3390/s24031007.
7
Multi-Robot Collaborative Mapping with Integrated Point-Line Features for Visual SLAM.
Sensors (Basel). 2024 Sep 4;24(17):5743. doi: 10.3390/s24175743.
8
A Novel Approach for Lidar-Based Robot Localization in a Scale-Drifted Map Constructed Using Monocular SLAM.
Sensors (Basel). 2019 May 14;19(10):2230. doi: 10.3390/s19102230.
9
DOT-SLAM: A Stereo Visual Simultaneous Localization and Mapping (SLAM) System with Dynamic Object Tracking Based on Graph Optimization.
Sensors (Basel). 2024 Jul 18;24(14):4676. doi: 10.3390/s24144676.

Cited By

1
An Improved Method for Enhancing the Accuracy and Speed of Dynamic Object Detection Based on YOLOv8s.
Sensors (Basel). 2024 Dec 26;25(1):85. doi: 10.3390/s25010085.

References

1
YOLACT++: Better Real-Time Instance Segmentation.
IEEE Trans Pattern Anal Mach Intell. 2022 Feb;44(2):1108-1121. doi: 10.1109/TPAMI.2020.3014297. Epub 2022 Jan 7.
2
RGB-D SLAM in Dynamic Environments Using Point Correlations.
IEEE Trans Pattern Anal Mach Intell. 2022 Jan;44(1):373-389. doi: 10.1109/TPAMI.2020.3010942. Epub 2021 Dec 7.