
Development and Experimental Validation of an Intelligent Camera Model for Automated Driving.

Affiliations

Virtual Vehicle Research GmbH, Inffeldgasse 21a, 8010 Graz, Austria.

Department of Geography and Regional Science, University of Graz, Heinrichstraße 36, 8010 Graz, Austria.

Publication

Sensors (Basel). 2021 Nov 15;21(22):7583. doi: 10.3390/s21227583.

DOI: 10.3390/s21227583
PMID: 34833657
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8622060/
Abstract

The virtual testing and validation of advanced driver assistance system and automated driving (ADAS/AD) functions require efficient and realistic perception sensor models. In particular, the limitations and measurement errors of real perception sensors need to be simulated realistically in order to generate useful sensor data for the ADAS/AD function under test. In this paper, a novel sensor modeling approach for automotive perception sensors is introduced. The novel approach combines kernel density estimation with regression modeling and puts the main focus on the position measurement errors. The modeling approach is designed for any automotive perception sensor that provides position estimations at the object level. To demonstrate and evaluate the new approach, a common state-of-the-art automotive camera (Mobileye 630) was considered. Both sensor measurements (Mobileye position estimations) and ground-truth data (DGPS positions of all attending vehicles) were collected during a large measurement campaign on a Hungarian highway to support the development and experimental validation of the new approach. The quality of the model was tested and compared to reference measurements, leading to a pointwise position error of 9.60% in the lateral and 1.57% in the longitudinal direction. Additionally, the modeling of the natural scattering of the sensor model output was satisfying. In particular, the deviations of the position measurements were well modeled with this approach.
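The abstract describes combining kernel density estimation (KDE) with regression modeling to reproduce a perception sensor's position errors: a regression captures the systematic relation between ground truth and measurement, and a KDE over the residuals reproduces the natural scatter. The sketch below illustrates that general idea on synthetic data; the data, the linear regression choice, and the `simulate_measurement` helper are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic ground-truth longitudinal distances (stand-in for DGPS) and
# noisy sensor readings whose error grows with range -- illustrative only.
gt = rng.uniform(5.0, 80.0, size=2000)
meas = gt + (0.5 + 0.02 * gt) * rng.standard_normal(2000)

# Step 1: regression model of the measurement as a function of ground
# truth (a first-degree polynomial here; the paper's choice may differ).
coeffs = np.polyfit(gt, meas, deg=1)
predicted = np.polyval(coeffs, gt)

# Step 2: kernel density estimate of the position-error residuals.
residuals = meas - predicted
kde = gaussian_kde(residuals)

def simulate_measurement(true_distance: float) -> float:
    """Simulated sensor output: regression prediction plus a residual
    drawn from the KDE, reproducing the sensor's natural scatter."""
    return np.polyval(coeffs, true_distance) + kde.resample(1)[0, 0]

samples = np.array([simulate_measurement(40.0) for _ in range(500)])
print(samples.mean(), samples.std())
```

Sampling residuals from a KDE rather than a fitted Gaussian is what lets such a model reproduce skewed or heavy-tailed error distributions of a real sensor.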


Figures (sensors-21-07583, g001–g007 and g009–g018): available in the full text at https://pmc.ncbi.nlm.nih.gov/articles/PMC8622060/

Similar articles

1
Development and Experimental Validation of an Intelligent Camera Model for Automated Driving.
Sensors (Basel). 2021 Nov 15;21(22):7583. doi: 10.3390/s21227583.
2
Automotive Lidar Modelling Approach Based on Material Properties and Lidar Capabilities.
Sensors (Basel). 2020 Jun 10;20(11):3309. doi: 10.3390/s20113309.
3
Realistic 3D Simulators for Automotive: A Review of Main Applications and Features.
Sensors (Basel). 2024 Sep 10;24(18):5880. doi: 10.3390/s24185880.
4
Development of an Energy Efficient and Cost Effective Autonomous Vehicle Research Platform.
Sensors (Basel). 2022 Aug 11;22(16):5999. doi: 10.3390/s22165999.
5
Bayesian Gaussian Mixture Models for Enhanced Radar Sensor Modeling: A Data-Driven Approach towards Sensor Simulation for ADAS/AD Development.
Sensors (Basel). 2024 Mar 28;24(7):2177. doi: 10.3390/s24072177.
6
Stabilization and Validation of 3D Object Position Using Multimodal Sensor Fusion and Semantic Segmentation.
Sensors (Basel). 2020 Feb 18;20(4):1110. doi: 10.3390/s20041110.
7
A Survey on Modelling of Automotive Radar Sensors for Virtual Test and Validation of Automated Driving.
Sensors (Basel). 2022 Jul 29;22(15):5693. doi: 10.3390/s22155693.
8
A Survey on Ground Segmentation Methods for Automotive LiDAR Sensors.
Sensors (Basel). 2023 Jan 5;23(2):601. doi: 10.3390/s23020601.
9
Research Scenarios of Autonomous Vehicles, the Sensors and Measurement Systems Used in Experiments.
Sensors (Basel). 2022 Aug 31;22(17):6586. doi: 10.3390/s22176586.
10
A Novel Approach for Simulation of Automotive Radar Sensors Designed for Systematic Support of Vehicle Development.
Sensors (Basel). 2023 Mar 17;23(6):3227. doi: 10.3390/s23063227.

Cited by

1
Bayesian Gaussian Mixture Models for Enhanced Radar Sensor Modeling: A Data-Driven Approach towards Sensor Simulation for ADAS/AD Development.
Sensors (Basel). 2024 Mar 28;24(7):2177. doi: 10.3390/s24072177.
2
Multisensory Testing Framework for Advanced Driver Assistant Systems Supported by High-Quality 3D Simulation.
Sensors (Basel). 2021 Dec 18;21(24):8458. doi: 10.3390/s21248458.

References

1
Configurable Sensor Model Architecture for the Development of Automated Driving Systems.
Sensors (Basel). 2021 Jul 8;21(14):4687. doi: 10.3390/s21144687.
2
Motorway Measurement Campaign to Support R&D Activities in the Field of Automated Driving Technologies.
Sensors (Basel). 2021 Mar 19;21(6):2169. doi: 10.3390/s21062169.
3
The ApolloScape Open Dataset for Autonomous Driving and Its Application.
IEEE Trans Pattern Anal Mach Intell. 2020 Oct;42(10):2702-2719. doi: 10.1109/TPAMI.2019.2926463. Epub 2019 Jul 2.
4
Real time speed estimation of moving vehicles from side view images from an uncalibrated video camera.
Sensors (Basel). 2010;10(5):4805-24. doi: 10.3390/s100504805. Epub 2010 May 11.