Wu Yunhao, Zhang Ziyao, Chen Haifeng, Li Jian
College of Electronic Information and Artificial Intelligence, Shaanxi University of Science and Technology, Xi'an 710021, China.
School of Physics, Peking University, Beijing 100871, China.
Sensors (Basel). 2025 Aug 10;25(16):4952. doi: 10.3390/s25164952.
In GNSS-denied settings, such as indoor and underground environments, research on simultaneous localization and mapping (SLAM) remains a focal point. Mitigating the influence of dynamic objects on positioning accuracy and constructing a persistent map containing only static elements are pivotal objectives of visual SLAM in dynamic scenes. This paper introduces optical flow motion segmentation-based SLAM (OS-SLAM), a dynamic-environment SLAM system that incorporates optical flow motion segmentation for enhanced robustness. First, a lightweight multi-scale optical flow network is developed and optimized with multi-scale feature-extraction and update modules, improving rigid-mask motion segmentation accuracy while maintaining real-time performance. Second, a novel approach fusing YOLO-fastest object detection with Rigidmask segmentation is proposed to mitigate mis-segmentation of the static background caused by non-rigid moving objects. Finally, a static dense point cloud map is generated by filtering out abnormal point clouds. OS-SLAM integrates optical flow estimation with motion segmentation to effectively reduce the impact of dynamic objects. Experiments on the Technical University of Munich (TUM) dataset demonstrate that the proposed method significantly outperforms ORB-SLAM3 on highly dynamic sequences, reducing absolute position error (APE) by 91.2% and relative position error (RPE) by 45.1% on average.
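The core fusion step described in the abstract, combining a detector mask for potentially non-rigid movers (e.g., people) with an optical-flow rigidity mask, then rejecting features that fall on dynamic pixels, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function names and the representation of masks as boolean arrays are assumptions.

```python
import numpy as np

def fuse_dynamic_mask(rigid_mask: np.ndarray, detection_mask: np.ndarray) -> np.ndarray:
    """Union of two per-pixel dynamic-region masks.

    rigid_mask:     True where optical-flow motion segmentation (e.g., a
                    Rigidmask-style network) labels a pixel as moving.
    detection_mask: True inside regions of potentially non-rigid movers
                    reported by an object detector such as YOLO-fastest.
    """
    return np.logical_or(rigid_mask, detection_mask)

def filter_static_keypoints(keypoints, dynamic_mask):
    """Keep only keypoints that land on static (non-dynamic) pixels."""
    h, w = dynamic_mask.shape
    static = []
    for u, v in keypoints:  # (u, v) = (column, row) pixel coordinates
        ui, vi = int(round(u)), int(round(v))
        if 0 <= vi < h and 0 <= ui < w and not dynamic_mask[vi, ui]:
            static.append((u, v))
    return static
```

In a full pipeline, only the surviving static keypoints would be passed to the tracking and mapping back end, which is how such systems reduce the pose drift caused by dynamic objects.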