

Dynamic Validation of Calibration Accuracy and Structural Robustness of a Multi-Sensor Mobile Robot.

Author Information

Liu Yang, Cui Ximin, Fan Shenghong, Wang Qiang, Liu Yuhan, Sun Yanbiao, Wang Guo

Affiliations

School of Geosciences and Surveying Engineering, China University of Mining and Technology (Beijing), Beijing 100083, China.

Beijing Prodetec Technology Co., Ltd., Beijing 100083, China.

Publication Information

Sensors (Basel). 2024 Jun 16;24(12):3896. doi: 10.3390/s24123896.

Abstract

For mobile robots, high-precision integrated calibration and structural robustness of multi-sensor systems are important prerequisites for healthy operation in later stages. Currently, there is no well-established method for validating the calibration accuracy and structural robustness of multi-sensor systems, especially under dynamic traveling conditions. This paper presents a novel validation method for the calibration accuracy and structural robustness of a multi-sensor mobile robot. The method employs a ground-object-air cooperation mechanism, termed the "ground surface simulation field (GSSF)-mobile robot-photoelectric transmitter station (PTS)". Firstly, a static high-precision GSSF is established with the true north datum as a unified reference. Secondly, a rotatable synchronous tracking system (PTS) is assembled to perform real-time pose measurements of the mobile vehicle, and the relationship between each sensor and the vehicle body is used to determine the dynamic pose of each sensor. Finally, the calibration accuracy and structural robustness of the sensors are dynamically evaluated. In this context, epipolar line alignment is employed to evaluate the relative orientation calibration accuracy of the binocular cameras. Point cloud projection and superposition are used to evaluate the absolute calibration accuracy and structural robustness of the individual sensors, including the navigation camera (Navcam), hazard avoidance camera (Hazcam), multispectral camera, time-of-flight depth camera (TOF), and light detection and ranging (LiDAR), with respect to the vehicle body. The experimental results demonstrate that the proposed method offers a reliable means of dynamic validation for the testing phase of a mobile robot.
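
The abstract does not give implementation details for the epipolar line alignment check, but a common way to quantify it is to measure the residual vertical disparity of matched features in a rectified stereo pair: if the relative orientation of the binocular camera is well calibrated, corresponding points should fall on the same image row. The sketch below illustrates this idea; the OpenCV-based feature matching, the function name, and the file names are assumptions, not the authors' code.

```python
# Minimal sketch (assumed workflow, not the paper's implementation):
# after stereo rectification, matched features should lie on the same row,
# so the mean residual vertical disparity indicates how well the relative
# orientation of a binocular camera (e.g., a Navcam or Hazcam pair) is calibrated.
import cv2
import numpy as np

def epipolar_alignment_error(img_left, img_right):
    """Mean vertical disparity (pixels) of matched features in a rectified pair."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_l, des_l = orb.detectAndCompute(img_left, None)
    kp_r, des_r = orb.detectAndCompute(img_right, None)
    if des_l is None or des_r is None:
        return float("nan")
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_l, des_r)
    dv = [abs(kp_l[m.queryIdx].pt[1] - kp_r[m.trainIdx].pt[1]) for m in matches]
    return float(np.mean(dv)) if dv else float("nan")

# Hypothetical usage with a rectified stereo pair (file names assumed):
# err = epipolar_alignment_error(cv2.imread("navcam_L.png", 0), cv2.imread("navcam_R.png", 0))
# print(f"mean epipolar misalignment: {err:.2f} px")
```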

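Similarly, the point cloud projection and superposition step can be understood as transforming each sensor's point cloud into the world frame through the calibrated sensor-to-body extrinsics and the body pose measured by the PTS, then superimposing the result on the GSSF reference and reporting the residuals. The sketch below shows one way to compute such residuals; the frame names, transform conventions, and data layout are assumptions rather than the paper's actual pipeline.

```python
# Minimal sketch (assumed frames and data, not the paper's pipeline):
# project a sensor point cloud into the world frame via the calibrated
# sensor-to-body extrinsics and the PTS-measured body pose, then compare it
# against GSSF reference points via nearest-neighbour residuals.
import numpy as np
from scipy.spatial import cKDTree

def superposition_residuals(points_sensor, T_body_sensor, T_world_body, gssf_points):
    """RMS distance (in the GSSF units) between projected sensor points and reference points.

    points_sensor: (N, 3) points in the sensor frame
    T_body_sensor: 4x4 sensor-to-body transform from calibration
    T_world_body:  4x4 body-to-world transform measured by the PTS
    gssf_points:   (M, 3) reference points of the ground surface simulation field
    """
    pts_h = np.hstack([points_sensor, np.ones((len(points_sensor), 1))])  # homogeneous coords
    pts_world = (T_world_body @ T_body_sensor @ pts_h.T).T[:, :3]
    dists, _ = cKDTree(gssf_points).query(pts_world)
    return float(np.sqrt(np.mean(dists ** 2)))
```

Under this reading, small residuals over repeated dynamic runs would indicate both accurate absolute calibration and a mechanically stable sensor mounting, which matches the paper's stated evaluation goals.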

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2233/11207938/e6715026c065/sensors-24-03896-g001.jpg
