Suppr 超能文献




Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles.

Author Affiliations

Daegu Gyeongbuk Institute of Science & Technology (DGIST), College of Transdisciplinary Studies, Daegu 333, Korea.

Department of Interdisciplinary Engineering, Daegu Gyeongbuk Institute of Science & Technology (DGIST), Daegu 333, Korea.

Publication Information

Sensors (Basel). 2021 Apr 30;21(9):3124. doi: 10.3390/s21093124.

DOI: 10.3390/s21093124
PMID: 33946282
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8125378/
Abstract

In autonomous driving, using a variety of sensors to recognize preceding vehicles at middle and long distances helps improve driving performance and enables various functions. However, if only LiDAR or only cameras are used in the recognition stage, it is difficult to obtain the necessary data because of the limitations of each sensor. In this paper, we propose a method of converting vision-tracked data into bird's-eye-view (BEV) coordinates using the equation that projects LiDAR points onto the image, together with a method of fusing the LiDAR data with the vision-tracked data. The proposed method proved effective through its results in detecting the closest in-path vehicle (CIPV) in various situations. In addition, when the fusion result was evaluated under the Euro NCAP autonomous emergency braking (AEB) test protocol, the improved perception performance yielded better AEB performance than using LiDAR alone. The performance of the proposed method was demonstrated through actual vehicle tests in various scenarios. Consequently, the proposed sensor fusion method significantly improved the adaptive cruise control (ACC) function in autonomous maneuvering. We expect this improvement in perception performance to contribute to the overall stability of ACC.
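The two steps the abstract describes — projecting LiDAR points onto the image plane with the camera calibration, and selecting the closest in-path vehicle from fused BEV tracks — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the intrinsics `K`, the extrinsics `T_lidar_to_cam`, and the ego-lane half width are assumed values chosen for the example.

```python
import numpy as np

# Assumed (hypothetical) calibration -- not taken from the paper.
K = np.array([[700.0,   0.0, 640.0],     # camera intrinsics (fx, fy, cx, cy)
              [  0.0, 700.0, 360.0],
              [  0.0,   0.0,   1.0]])
T_lidar_to_cam = np.eye(4)               # LiDAR-to-camera extrinsics (identity for illustration)

def project_lidar_to_image(pts_lidar):
    """Project Nx3 LiDAR points into pixel coordinates (u, v)."""
    pts_h = np.hstack([pts_lidar, np.ones((len(pts_lidar), 1))])
    cam = (T_lidar_to_cam @ pts_h.T).T[:, :3]   # LiDAR frame -> camera frame
    cam = cam[cam[:, 2] > 0]                    # drop points behind the camera
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]               # perspective divide

def select_cipv(bev_tracks, lane_half_width=1.75):
    """Pick the closest in-path vehicle from fused BEV tracks.
    Each track is (x_forward_m, y_lateral_m); 'in path' means the lateral
    offset stays within the assumed ego-lane half width."""
    in_path = [t for t in bev_tracks if abs(t[1]) <= lane_half_width]
    return min(in_path, key=lambda t: t[0]) if in_path else None
```

For example, among BEV tracks at (30 m, 0.5 m), (12 m, -0.3 m), and (18 m, 4.0 m), the 12 m track would be selected as the CIPV; the 18 m track falls outside the assumed ego lane.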


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/71a8/8125378/83512a114780/sensors-21-03124-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/71a8/8125378/8ebaa4cd9a24/sensors-21-03124-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/71a8/8125378/a3c81169888f/sensors-21-03124-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/71a8/8125378/23ae03fc3fc7/sensors-21-03124-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/71a8/8125378/82adb4945680/sensors-21-03124-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/71a8/8125378/4962493a6fc9/sensors-21-03124-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/71a8/8125378/83b653d88970/sensors-21-03124-g0A1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/71a8/8125378/bb32182f1edd/sensors-21-03124-g0A2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/71a8/8125378/56feb4aaf109/sensors-21-03124-g0A3.jpg

Similar Articles

1. Estimation of the Closest In-Path Vehicle by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicles.
Sensors (Basel). 2021 Apr 30;21(9):3124. doi: 10.3390/s21093124.
2. Fast vehicle detection based on colored point cloud with bird's eye view representation.
Sci Rep. 2023 May 8;13(1):7447. doi: 10.1038/s41598-023-34479-z.
3. On the Development of Autonomous Vehicle Safety Distance by an RSS Model Based on a Variable Focus Function Camera.
Sensors (Basel). 2021 Oct 11;21(20):6733. doi: 10.3390/s21206733.
4. Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review.
Sensors (Basel). 2021 Mar 18;21(6):2140. doi: 10.3390/s21062140.
5. Multitarget-Tracking Method Based on the Fusion of Millimeter-Wave Radar and LiDAR Sensor Information for Autonomous Vehicles.
Sensors (Basel). 2023 Aug 3;23(15):6920. doi: 10.3390/s23156920.
6. Free Space Detection Using Camera-LiDAR Fusion in a Bird's Eye View Plane.
Sensors (Basel). 2021 Nov 17;21(22):7623. doi: 10.3390/s21227623.
7. Investigating the Improvement of Autonomous Vehicle Performance through the Integration of Multi-Sensor Dynamic Mapping Techniques.
Sensors (Basel). 2023 Feb 21;23(5):2369. doi: 10.3390/s23052369.
8. Robust Fusion of LiDAR and Wide-Angle Camera Data for Autonomous Mobile Robots.
Sensors (Basel). 2018 Aug 20;18(8):2730. doi: 10.3390/s18082730.
9. High Definition 3D Map Creation Using GNSS/IMU/LiDAR Sensor Integration to Support Autonomous Vehicle Navigation.
Sensors (Basel). 2020 Feb 7;20(3):899. doi: 10.3390/s20030899.
10. Delving Into the Devils of Bird's-Eye-View Perception: A Review, Evaluation and Recipe.
IEEE Trans Pattern Anal Mach Intell. 2024 Apr;46(4):2151-2170. doi: 10.1109/TPAMI.2023.3333838. Epub 2024 Mar 6.

Cited By

1. Study on Multi-Heterogeneous Sensor Data Fusion Method Based on Millimeter-Wave Radar and Camera.
Sensors (Basel). 2023 Jun 29;23(13):6044. doi: 10.3390/s23136044.
2. Inter-row information recognition of maize in the middle and late stages via LiDAR supplementary vision.
Front Plant Sci. 2022 Dec 1;13:1024360. doi: 10.3389/fpls.2022.1024360. eCollection 2022.
3. Research Scenarios of Autonomous Vehicles, the Sensors and Measurement Systems Used in Experiments.
Sensors (Basel). 2022 Aug 31;22(17):6586. doi: 10.3390/s22176586.

References

1. Calibration between color camera and 3D LIDAR instruments with a polygonal planar board.
Sensors (Basel). 2014 Mar 17;14(3):5333-53. doi: 10.3390/s140305333.