Luo Lishu, Peng Fulun, Dong Longhui
Xi'an Institute of Applied Optics, Xi'an 710065, China.
Sensors (Basel). 2024 Sep 25;24(19):6193. doi: 10.3390/s24196193.
High-precision simultaneous localization and mapping (SLAM) in dynamic real-world environments plays a crucial role in autonomous robot navigation, self-driving cars, and drone control. To address this dynamic localization problem, this paper proposes a dynamic odometry method based on FAST-LIVO, a fast LiDAR (light detection and ranging)-inertial-visual odometry system, integrating neural networks with the laser, camera, and inertial measurement unit modalities. The method first constructs visual-inertial and LiDAR-inertial odometry subsystems. A lightweight neural network then removes dynamic elements from the visual input, and dynamic clustering removes dynamic objects from the LiDAR point clouds, ensuring the reliability of the remaining environmental data. Validation on datasets shows that the proposed multi-sensor fusion dynamic odometry achieves high-precision pose estimation in complex dynamic environments with high continuity, reliability, and robustness to dynamics.
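The abstract's two-pronged dynamic filtering (a segmentation mask for the camera stream, clustering for the LiDAR stream) can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, thresholds, and the centroid-matching heuristic for flagging moving clusters are all assumptions made here for illustration.

```python
# Hypothetical sketch of the dynamic-element filtering described in the abstract.
# Names, thresholds, and data layout are assumptions, not the paper's code.
import numpy as np
from scipy.spatial import cKDTree


def filter_dynamic_pixels(features_uv, dynamic_mask):
    """Drop visual features that fall on pixels a lightweight segmentation
    network has labeled as dynamic (mask value True = dynamic)."""
    u = features_uv[:, 0].astype(int)
    v = features_uv[:, 1].astype(int)
    keep = ~dynamic_mask[v, u]
    return features_uv[keep]


def euclidean_cluster(points, radius=0.5, min_size=10):
    """Greedy Euclidean clustering of a LiDAR scan (N x 3 array of XYZ)."""
    tree = cKDTree(points)
    unvisited = np.ones(len(points), dtype=bool)
    clusters = []
    for seed in range(len(points)):
        if not unvisited[seed]:
            continue
        queue, members = [seed], []
        unvisited[seed] = False
        while queue:
            idx = queue.pop()
            members.append(idx)
            for nb in tree.query_ball_point(points[idx], radius):
                if unvisited[nb]:
                    unvisited[nb] = False
                    queue.append(nb)
        if len(members) >= min_size:
            clusters.append(np.array(members))
    return clusters


def drop_moving_clusters(prev_scan, curr_scan, motion_thresh=0.3):
    """Remove clusters in the current (ego-motion-compensated) scan that have
    no nearby counterpart in the previous scan, treating them as dynamic."""
    prev_centroids = np.array(
        [prev_scan[c].mean(axis=0) for c in euclidean_cluster(prev_scan)]
    )
    keep_mask = np.ones(len(curr_scan), dtype=bool)
    for c in euclidean_cluster(curr_scan):
        if len(prev_centroids) == 0:
            break
        centroid = curr_scan[c].mean(axis=0)
        nearest = np.linalg.norm(prev_centroids - centroid, axis=1).min()
        if nearest > motion_thresh:  # cluster moved or appeared: flag as dynamic
            keep_mask[c] = False
    return curr_scan[keep_mask]
```

In a full pipeline, the surviving image features and static LiDAR points would feed the visual-inertial and LiDAR-inertial subsystems for fused pose estimation; the sketch only shows the pre-filtering step.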