Yang Tianyu, Wei Shuangfeng, Nan Jingxuan, Li Mingyang, Li Mingrui
School of Geomatics and Urban Spatial Informatics, Beijing University of Civil Engineering and Architecture, Beijing 102616, China.
Research Center of Representative Building and Architectural Heritage Database, Ministry of Education, Beijing 102616, China.
Sensors (Basel). 2025 Oct 30;25(21):6641. doi: 10.3390/s25216641.
Simultaneous Localization and Mapping (SLAM) uses sensor data to concurrently construct a map of the environment and estimate the sensor's own pose, and finds wide application in scenarios such as robotic navigation and augmented reality. SLAM systems based on 3D Gaussian Splatting (3DGS) have garnered significant attention due to their real-time, high-fidelity rendering capabilities. However, in real-world environments containing dynamic objects, existing 3DGS-SLAM methods often suffer from mapping errors and tracking drift caused by dynamic interference. To address this challenge, this paper proposes BDGS-SLAM, a Bayesian Dynamic Gaussian Splatting SLAM framework specifically designed for dynamic environments. During the tracking phase, the system integrates semantic detection results from YOLOv5 to build a dynamic prior probability model based on Bayesian filtering, enabling accurate identification of dynamic Gaussians. In the mapping phase, a multi-view probabilistic update mechanism is employed, which aggregates historical observation information from co-visible keyframes. By introducing an exponential decay factor to dynamically adjust weights, this mechanism effectively restores static Gaussians that were mistakenly culled. Furthermore, an adaptive dynamic Gaussian optimization strategy is proposed. This strategy applies penalty constraints to suppress the negative impact of dynamic Gaussians on rendering while avoiding the erroneous removal of static Gaussians, thereby preserving critical scene information. Experimental results demonstrate that, compared to baseline methods, BDGS-SLAM achieves comparable tracking accuracy while producing fewer artifacts in rendered results and realizing higher-fidelity scene reconstruction.
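The abstract's two probabilistic mechanisms can be illustrated with a minimal sketch: a binary Bayes filter step that updates a Gaussian's dynamic probability from a per-frame detection, and an exponentially decayed weighted average that aggregates historical probabilities from co-visible keyframes. The likelihood values (`p_hit`, `p_false`), the decay rate, and the function names are illustrative assumptions, not the paper's actual parameterization.

```python
import math

def bayes_update(prior, detected, p_hit=0.9, p_false=0.1):
    """One binary Bayes filter step for a Gaussian's dynamic probability.

    prior    : current probability that the Gaussian is dynamic
    detected : whether the Gaussian projects inside a YOLOv5 dynamic-object box
    p_hit    : assumed P(detected | dynamic)
    p_false  : assumed P(detected | static), i.e. the false-positive rate
    """
    like_dyn = p_hit if detected else (1.0 - p_hit)
    like_sta = p_false if detected else (1.0 - p_false)
    # Bayes' rule with binary hypotheses {dynamic, static}
    return like_dyn * prior / (like_dyn * prior + like_sta * (1.0 - prior))

def aggregate_keyframes(probs, decay=0.5):
    """Exponentially decayed average of dynamic probabilities observed in
    co-visible keyframes, ordered most recent first: older observations
    contribute less, letting mistakenly culled static Gaussians recover."""
    weights = [math.exp(-decay * i) for i in range(len(probs))]
    return sum(w * p for w, p in zip(weights, probs)) / sum(weights)
```

For example, starting from an uninformative prior of 0.5, a single positive detection with the assumed likelihoods raises the dynamic probability to 0.9, while a run of recent low-probability observations pulls the aggregated estimate back down, which is the intuition behind restoring static Gaussians.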