
LiDAR-360 RGB Camera-360 Thermal Camera Targetless Calibration for Dynamic Situations.

Authors

Tran Khanh Bao, Carballo Alexander, Takeda Kazuya

Affiliations

Graduate School of Informatics, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8601, Japan.

Faculty of Engineering and Graduate School of Engineering, Gifu University, 1-1 Yanagido, Gifu City 501-1193, Japan.

Publication

Sensors (Basel). 2024 Nov 10;24(22):7199. doi: 10.3390/s24227199.

DOI: 10.3390/s24227199
PMID: 39598976
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11598782/
Abstract

Integrating multiple types of sensors into autonomous systems, such as cars and robots, has become a widely adopted approach in modern technology. Among these sensors, RGB cameras, thermal cameras, and LiDAR are particularly valued for their ability to provide comprehensive environmental data. Despite their advantages, however, current research primarily focuses on one sensor, or a combination of two, at a time; the full potential of utilizing all three sensors is often neglected. One key challenge is ego-motion compensation of the data in dynamic situations, which arises from the rotational nature of the LiDAR sensor; another is the blind spots of standard cameras due to their limited field of view. To resolve these problems, this paper proposes a novel method for the simultaneous registration of LiDAR, panoramic RGB cameras, and panoramic thermal cameras in dynamic environments without the need for calibration targets. Initially, essential features from RGB images, thermal data, and LiDAR point clouds are extracted through a novel method designed to capture significant raw data characteristics. These extracted features then serve as a foundation for ego-motion compensation, optimizing the initial dataset. Subsequently, the raw features can be further refined to enhance calibration accuracy, achieving more precise alignment results. The results of the paper demonstrate the effectiveness of this approach in enhancing multi-sensor calibration compared with other approaches. At high speeds of around 9 m/s, LiDAR-camera calibration accuracy improves by about 30 percent in some situations. The proposed method has the potential to significantly improve the reliability and accuracy of autonomous systems in real-world scenarios, particularly under challenging environmental conditions.
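The pipeline the abstract describes (extract features per sensor, use them for ego-motion compensation, then refine the calibration) hinges on de-skewing each rotating LiDAR sweep before alignment. The sketch below illustrates per-point ego-motion compensation only; it assumes a constant planar velocity and yaw rate over the sweep, and the function name and motion model are our own illustration, not the paper's algorithm.

```python
# Illustrative sketch (not the paper's method): de-skewing one rotating-LiDAR
# sweep under an assumed constant planar velocity and yaw rate. Each point is
# captured at a slightly different time while the vehicle moves, so each point
# is moved into the sensor pose at a single reference time t_ref.
import math

def deskew_point(pt, t, v_xy, yaw_rate, t_ref):
    """Return the point (x, y, z), captured at time t, expressed in the
    sensor frame at the reference time t_ref.

    pt       : (x, y, z) in the sensor frame at capture time
    t        : capture time of this point (s)
    v_xy     : assumed constant planar velocity (vx, vy) of the sensor (m/s)
    yaw_rate : assumed constant yaw rate (rad/s)
    t_ref    : reference time the whole sweep is corrected to (s)
    """
    dt = t_ref - t                        # how far this point lags the reference
    yaw = yaw_rate * dt                   # heading change accumulated over dt
    c, s = math.cos(yaw), math.sin(yaw)
    x, y, z = pt
    # Rotate about the z-axis by the accumulated yaw, then translate by the
    # distance the sensor travelled during dt.
    return (c * x - s * y + v_xy[0] * dt,
            s * x + c * y + v_xy[1] * dt,
            z)
```

For example, at the paper's high-speed scenario of roughly 9 m/s, a point captured a full second before the reference time would be displaced by about 9 m without this correction; over a typical 0.1 s sweep the skew is smaller but still significant for calibration.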


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/f5c388db6369/sensors-24-07199-g028.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/e422391257a8/sensors-24-07199-g022.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/7c8dd622be44/sensors-24-07199-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/55efa8bd04a8/sensors-24-07199-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/214a2ea2462e/sensors-24-07199-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/d97867cddc80/sensors-24-07199-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/72c820985a11/sensors-24-07199-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/aa2193827726/sensors-24-07199-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/5fcaa2126d99/sensors-24-07199-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/845e0a0c4e25/sensors-24-07199-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/7daa645cbafc/sensors-24-07199-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/e28726163346/sensors-24-07199-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/17cbdc84696a/sensors-24-07199-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/1f96f1d7bc1f/sensors-24-07199-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/5cf9bbac5d09/sensors-24-07199-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/a8f163fab5fb/sensors-24-07199-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/a2ecbdd3f6a4/sensors-24-07199-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/9faaa2eda854/sensors-24-07199-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/72ce5ee65705/sensors-24-07199-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/79389b0a6a5c/sensors-24-07199-g018.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/d58e68d37ab2/sensors-24-07199-g019.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/380877e69372/sensors-24-07199-g020.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/7270da37fa0b/sensors-24-07199-g021.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/9d1be13e3efe/sensors-24-07199-g023.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/52e843220a69/sensors-24-07199-g024.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/eff6bbd4491b/sensors-24-07199-g025.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/830396833d86/sensors-24-07199-g026.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9316/11598782/6ff26d4734ad/sensors-24-07199-g027.jpg

Similar Articles

1. LiDAR-360 RGB Camera-360 Thermal Camera Targetless Calibration for Dynamic Situations.
   Sensors (Basel). 2024 Nov 10;24(22):7199. doi: 10.3390/s24227199.
2. Adaptive Point-Line Fusion: A Targetless LiDAR-Camera Calibration Method with Scheme Selection for Autonomous Driving.
   Sensors (Basel). 2024 Feb 8;24(4):1127. doi: 10.3390/s24041127.
3. Accurate Calibration of Multi-LiDAR-Multi-Camera Systems.
   Sensors (Basel). 2018 Jul 3;18(7):2139. doi: 10.3390/s18072139.
4. Semantic Fusion Algorithm of 2D LiDAR and Camera Based on Contour and Inverse Projection.
   Sensors (Basel). 2025 Apr 17;25(8):2526. doi: 10.3390/s25082526.
5. Automatic Extrinsic Calibration of 3D LIDAR and Multi-Cameras Based on Graph Optimization.
   Sensors (Basel). 2022 Mar 13;22(6):2221. doi: 10.3390/s22062221.
6. Extrinsic Calibration between Camera and LiDAR Sensors by Matching Multiple 3D Planes.
   Sensors (Basel). 2019 Dec 20;20(1):52. doi: 10.3390/s20010052.
7. Multi-Level Optimization for Data-Driven Camera-LiDAR Calibration in Data Collection Vehicles.
   Sensors (Basel). 2023 Nov 1;23(21):8889. doi: 10.3390/s23218889.
8. RLCFormer: Automatic roadside LiDAR-Camera calibration framework with transformer.
   Heliyon. 2024 Sep 26;10(20):e38506. doi: 10.1016/j.heliyon.2024.e38506. eCollection 2024 Oct 30.
9. Line-Based Registration of Panoramic Images and LiDAR Point Clouds for Mobile Mapping.
   Sensors (Basel). 2016 Dec 31;17(1):70. doi: 10.3390/s17010070.
10. A Review of Deep Learning-Based LiDAR and Camera Extrinsic Calibration.
   Sensors (Basel). 2024 Jun 15;24(12):3878. doi: 10.3390/s24123878.

Cited By

1. T360Fusion: Temporal 360 Multimodal Fusion for 3D Object Detection via Transformers.
   Sensors (Basel). 2025 Aug 8;25(16):4902. doi: 10.3390/s25164902.