Qiu Haiyang, Zhang Xu, Wang Hui, Xiang Dan, Xiao Mingming, Zhu Zhiyu, Wang Lei
School of Naval Architecture and Ocean Engineering, Guangzhou Maritime University, Guangzhou 510725, China.
School of Automation, Jiangsu University of Science and Technology, Zhenjiang 212013, China.
Sensors (Basel). 2023 Oct 23;23(20):8655. doi: 10.3390/s23208655.
In this paper, we propose a robust, integrated visual odometry framework that combines the optical flow and feature point methods to achieve faster pose estimation with considerable accuracy and robustness during the odometry process. Our method uses optical flow tracking to accelerate the feature point matching process. Within the odometry, two pose-estimation paths are used: a global feature point method and a local feature point method. When optical flow tracking is good and a sufficient number of key points are matched successfully, the local feature point method uses the prior information from the optical flow to estimate the relative pose transformation. When optical flow tracking is poor and only a small number of key points are matched successfully, the feature point method with a filtering mechanism is used for pose estimation. By coupling and correlating these two methods, the proposed visual odometry greatly reduces the computation time of relative pose estimation, to about 40% of that of the ORB_SLAM3 front-end odometry, while remaining close to the ORB_SLAM3 front end in accuracy and robustness. The effectiveness of the method was validated and analyzed on the EuRoC dataset within the ORB_SLAM3 open-source framework, and the experimental results support the efficacy of the proposed approach.
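The abstract describes a switching strategy: track existing key points with optical flow and use the tracks directly when enough of them survive, otherwise fall back to descriptor-based feature matching with outlier filtering. The following is a minimal sketch of that idea using OpenCV; it is not the authors' implementation, and names such as MIN_TRACKED and estimate_relative_pose, as well as the specific thresholds, are illustrative assumptions.

```python
# Hedged sketch of the flow-first / feature-fallback switching idea.
import cv2
import numpy as np

MIN_TRACKED = 50  # assumed threshold for "enough" successfully tracked key points

def estimate_relative_pose(prev_gray, curr_gray, prev_pts, K, orb, bf):
    """Estimate (R, t) between two frames, preferring optical-flow tracks and
    falling back to ORB matching with a filtering step when tracking is poor."""
    # 1) Lucas-Kanade optical flow tracks the previous frame's key points.
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   prev_pts, None)
    ok = status.ravel() == 1
    good_prev, good_curr = prev_pts[ok], curr_pts[ok]

    if len(good_curr) >= MIN_TRACKED:
        # "Local" path: the flow tracks themselves serve as matches (prior info).
        p1, p2 = good_prev, good_curr
    else:
        # "Global" fallback: ORB detection + brute-force matching,
        # filtered with a simple ratio test.
        kp1, des1 = orb.detectAndCompute(prev_gray, None)
        kp2, des2 = orb.detectAndCompute(curr_gray, None)
        matches = bf.knnMatch(des1, des2, k=2)
        good = [m for m, n in matches if m.distance < 0.75 * n.distance]
        p1 = np.float32([kp1[m.queryIdx].pt for m in good])
        p2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # 2) Epipolar geometry with RANSAC rejects remaining outliers.
    E, mask = cv2.findEssentialMat(p1, p2, K, cv2.RANSAC, 0.999, 1.0)
    _, R, t, _ = cv2.recoverPose(E, p1, p2, K, mask=mask)
    return R, t

# Usage sketch: prev_pts from cv2.goodFeaturesToTrack or ORB key points of the
# previous frame (float32, Nx1x2); K is the camera intrinsic matrix.
# orb = cv2.ORB_create(); bf = cv2.BFMatcher(cv2.NORM_HAMMING)
```

The design point is that the cheap flow-tracking path handles the common case, so the more expensive detect-describe-match path (and its filtering) only runs when tracking quality degrades, which is where the reported reduction in front-end computation time comes from.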