

Efficient 3D Lidar Odometry Based on Planar Patches

Authors

Galeote-Luque Andres, Ruiz-Sarmiento Jose-Raul, Gonzalez-Jimenez Javier

Affiliation

Machine Perception and Intelligent Robotics Group (MAPIR-UMA), Malaga Institute for Mechatronics Engineering and Cyber-Physical Systems (IMECH.UMA), University of Malaga, 29071 Malaga, Spain.

Publication

Sensors (Basel). 2022 Sep 15;22(18):6976. doi: 10.3390/s22186976.

DOI: 10.3390/s22186976
PMID: 36146325
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9502187/
Abstract

In this paper we present a new way to compute the odometry of a 3D lidar in real-time. Due to the significant relation between these sensors and the rapidly increasing sector of autonomous vehicles, 3D lidars have improved in recent years, with modern models producing data in the form of range images. We take advantage of this ordered format to efficiently estimate the trajectory of the sensor as it moves in 3D space. The proposed method creates and leverages a flatness image in order to exploit the information found in flat surfaces of the scene. This allows for an efficient selection of planar patches from a first range image. Then, from a second image, keypoints related to said patches are extracted. This way, our proposal computes the ego-motion by imposing a coplanarity constraint between pairs <point, plane> whose correspondences are iteratively updated. The proposed algorithm is tested and compared with state-of-the-art ICP algorithms. Experiments show that our proposal, running on a single thread, can run 5× faster than a multi-threaded implementation of GICP, while providing a more accurate localization. A second version of the algorithm is also presented, which reduces the drift even further while needing less than half of the computation time of GICP. Both configurations of the algorithm run at frame rates common for most 3D lidars, 10 and 20 Hz on a standard CPU.
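The core of the method described above is a coplanarity constraint between <point, plane> pairs: each keypoint from the second range image should lie on its corresponding planar patch from the first. As an illustration only (not the authors' implementation), one inner iteration of such an alignment can be sketched as a linearized point-to-plane least-squares step under a small-angle assumption; the function name and data layout below are hypothetical.

```python
import numpy as np

def point_to_plane_step(points, plane_points, plane_normals):
    """One linearized point-to-plane alignment step (small-angle assumption).

    points:        (N, 3) keypoints extracted from the second range image
    plane_points:  (N, 3) a point on each corresponding planar patch
    plane_normals: (N, 3) unit normals of those patches
    Returns a 6-vector [rx, ry, rz, tx, ty, tz]: the incremental rotation
    (axis-angle, small) and translation that best satisfy n . (p - q) = 0
    for every correspondence, in the least-squares sense.
    """
    # Signed point-to-plane residual n . (p - q) for each pair.
    residuals = np.einsum('ij,ij->i', points - plane_points, plane_normals)
    # Linearized constraint: (p x n) . w  +  n . t  = -residual,
    # so each Jacobian row is [p x n, n].
    A = np.hstack([np.cross(points, plane_normals), plane_normals])
    b = -residuals
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

In a full odometry loop this step would be repeated while the <point, plane> correspondences are re-established between iterations, as the abstract describes; with correspondences spanning planes of varied orientation the 6-DoF motion is fully constrained.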


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2994/9502187/dc20299deee0/sensors-22-06976-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2994/9502187/31fd376f2fc2/sensors-22-06976-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2994/9502187/28eb60c61ba1/sensors-22-06976-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2994/9502187/a5028c6bd377/sensors-22-06976-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2994/9502187/772f49dad38c/sensors-22-06976-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2994/9502187/86c9863265b0/sensors-22-06976-g006.jpg

Similar articles

1. Efficient 3D Lidar Odometry Based on Planar Patches.
Sensors (Basel). 2022 Sep 15;22(18):6976. doi: 10.3390/s22186976.
2. Robust GICP-Based 3D LiDAR SLAM for Underground Mining Environment.
Sensors (Basel). 2019 Jul 1;19(13):2915. doi: 10.3390/s19132915.
3. Multi-LiDAR Mapping for Scene Segmentation in Indoor Environments for Mobile Robots.
Sensors (Basel). 2022 May 12;22(10):3690. doi: 10.3390/s22103690.
4. Analysis of Lidar Actuator System Influence on the Quality of Dense 3D Point Cloud Obtained with SLAM.
Sensors (Basel). 2023 Jan 8;23(2):721. doi: 10.3390/s23020721.
5. Extrinsic Calibration of Multiple 3D LiDAR Sensors by the Use of Planar Objects.
Sensors (Basel). 2022 Sep 23;22(19):7234. doi: 10.3390/s22197234.
6. Enhancing Solid State LiDAR Mapping with a 2D Spinning LiDAR in Urban Scenario SLAM on Ground Vehicles.
Sensors (Basel). 2021 Mar 4;21(5):1773. doi: 10.3390/s21051773.
7. Robust Lidar-Inertial Odometry with Ground Condition Perception and Optimization Algorithm for UGV.
Sensors (Basel). 2022 Sep 29;22(19):7424. doi: 10.3390/s22197424.
8. Optimized LOAM Using Ground Plane Constraints and SegMatch-Based Loop Detection.
Sensors (Basel). 2019 Dec 9;19(24):5419. doi: 10.3390/s19245419.
9. SLAM and 3D Semantic Reconstruction Based on the Fusion of Lidar and Monocular Vision.
Sensors (Basel). 2023 Jan 29;23(3):1502. doi: 10.3390/s23031502.
10. Pronto: A Multi-Sensor State Estimator for Legged Robots in Real-World Scenarios.
Front Robot AI. 2020 Jun 5;7:68. doi: 10.3389/frobt.2020.00068. eCollection 2020.

Cited by

1. Advanced Sensors Technologies Applied in Mobile Robot.
Sensors (Basel). 2023 Mar 8;23(6):2958. doi: 10.3390/s23062958.

References

1. Occlusion and the solution to the aperture problem for motion.
Vision Res. 1989;29(5):619-26. doi: 10.1016/0042-6989(89)90047-3.