

A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application.

Author Information

Vivacqua Rafael, Vassallo Raquel, Martins Felipe

Affiliations

Federal Institute of Education, Science and Technology of Espirito Santo, Serra ES 29173-087, Brazil.

Department of Electrical Engineering, Federal University of Espirito Santo, Vitória ES 29075-910, Brazil.

Publication Information

Sensors (Basel). 2017 Oct 16;17(10):2359. doi: 10.3390/s17102359.

DOI: 10.3390/s17102359
PMID: 29035334
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5676663/
Abstract

Autonomous driving in public roads requires precise localization within the range of few centimeters. Even the best current precise localization system based on the Global Navigation Satellite System (GNSS) can not always reach this level of precision, especially in an urban environment, where the signal is disturbed by surrounding buildings and artifacts. Laser range finder and stereo vision have been successfully used for obstacle detection, mapping and localization to solve the autonomous driving problem. Unfortunately, Light Detection and Ranging (LIDARs) are very expensive sensors and stereo vision requires powerful dedicated hardware to process the cameras information. In this context, this article presents a low-cost architecture of sensors and data fusion algorithm capable of autonomous driving in narrow two-way roads. Our approach exploits a combination of a short-range visual lane marking detector and a dead reckoning system to build a long and precise perception of the lane markings in the vehicle's backwards. This information is used to localize the vehicle in a map, that also contains the reference trajectory for autonomous driving. Experimental results show the successful application of the proposed system on a real autonomous driving situation.
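The dead-reckoning component the abstract refers to can be illustrated with a minimal odometry integrator: wheel-encoder distance and heading change are accumulated into a pose estimate, which is what lets short-range lane-marking detections be stitched into a long perception behind the vehicle. This is an illustrative sketch, not the authors' implementation; the function name, integration scheme (midpoint), and sample values are invented for illustration.

```python
import math

def dead_reckoning_step(pose, distance, delta_heading):
    """Integrate one odometry step.

    pose: (x, y, theta) current estimate in the map frame
    distance: distance travelled since the last step (e.g. wheel encoder)
    delta_heading: heading change over the step (e.g. gyro or steering model)
    """
    x, y, theta = pose
    # Advance along the mean heading of the step (midpoint integration),
    # which reduces error on curved segments compared to using theta alone.
    theta_mid = theta + delta_heading / 2.0
    x += distance * math.cos(theta_mid)
    y += distance * math.sin(theta_mid)
    theta += delta_heading
    return (x, y, theta)

# Accumulate a short trajectory from (distance, heading-change) samples.
pose = (0.0, 0.0, 0.0)
for dist, dth in [(1.0, 0.0), (1.0, 0.1), (1.0, 0.1)]:
    pose = dead_reckoning_step(pose, dist, dth)
```

Because pure dead reckoning drifts over time, the paper's approach anchors it with visual lane-marking detections matched against a map, rather than relying on the integrated pose alone.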


[Figures 1-32 (sensors-17-02359-g001 through g032) are available via the PMC full-text link above.]

Similar Articles

1. A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application.
   Sensors (Basel). 2017 Oct 16;17(10):2359. doi: 10.3390/s17102359.
2. Lane-Level Map-Matching Method for Vehicle Localization Using GPS and Camera on a High-Definition Map.
   Sensors (Basel). 2020 Apr 11;20(8):2166. doi: 10.3390/s20082166.
3. Lane Detection Aided Online Dead Reckoning for GNSS Denied Environments.
   Sensors (Basel). 2021 Oct 13;21(20):6805. doi: 10.3390/s21206805.
4. Comprehensive and Practical Vision System for Self-Driving Vehicle Lane-Level Localization.
   IEEE Trans Image Process. 2016 May;25(5):2075-88. doi: 10.1109/TIP.2016.2539683. Epub 2016 Mar 8.
5. Passive Sensor Integration for Vehicle Self-Localization in Urban Traffic Environment.
   Sensors (Basel). 2015 Dec 3;15(12):30199-220. doi: 10.3390/s151229795.
6. Development of an Autonomous Driving Vehicle for Garbage Collection in Residential Areas.
   Sensors (Basel). 2022 Nov 23;22(23):9094. doi: 10.3390/s22239094.
7. Automated Lane Centering: An Off-the-Shelf Computer Vision Product vs. Infrastructure-Based Chip-Enabled Raised Pavement Markers.
   Sensors (Basel). 2024 Apr 5;24(7):2327. doi: 10.3390/s24072327.
8. Map-Matching-Based Localization Using Camera and Low-Cost GPS for Lane-Level Accuracy.
   Sensors (Basel). 2022 Mar 22;22(7):2434. doi: 10.3390/s22072434.
9. A Novel Approach to Global Positioning System Accuracy Assessment, Verified on LiDAR Alignment of One Million Kilometers at a Continent Scale, as a Foundation for Autonomous Driving Safety Analysis.
   Sensors (Basel). 2021 Aug 24;21(17):5691. doi: 10.3390/s21175691.
10. A Survey of Localization Methods for Autonomous Vehicles in Highway Scenarios.
   Sensors (Basel). 2021 Dec 30;22(1):247. doi: 10.3390/s22010247.

Cited By

1. Exploring the key technologies and applications of 6G wireless communication network.
   iScience. 2025 Mar 25;28(5):112281. doi: 10.1016/j.isci.2025.112281. eCollection 2025 May 16.
2. Vehicle Localization Using Crowdsourced Data Collected on Urban Roads.
   Sensors (Basel). 2024 Aug 27;24(17):5531. doi: 10.3390/s24175531.
3. Digital Control and Demodulation Algorithm for Compact Open-Loop Fiber-Optic Gyroscope.
   Sensors (Basel). 2023 Jan 28;23(3):1473. doi: 10.3390/s23031473.
4. ZRO Drift Reduction of MEMS Gyroscopes via Internal and Packaging Stress Release.
   Micromachines (Basel). 2021 Oct 29;12(11):1329. doi: 10.3390/mi12111329.
5. Lane Detection Aided Online Dead Reckoning for GNSS Denied Environments.
   Sensors (Basel). 2021 Oct 13;21(20):6805. doi: 10.3390/s21206805.
6. Performance of Quad Mass Gyroscope in the Angular Rate Mode.
   Micromachines (Basel). 2021 Mar 4;12(3):266. doi: 10.3390/mi12030266.
7. Modular Approach for Odometry Localization Method for Vehicles with Increased Maneuverability.
   Sensors (Basel). 2020 Dec 25;21(1):79. doi: 10.3390/s21010079.
8. Self-Driving Car Location Estimation Based on a Particle-Aided Unscented Kalman Filter.
   Sensors (Basel). 2020 Apr 29;20(9):2544. doi: 10.3390/s20092544.
9. Monocular Localization with Vector HD Map (MLVHM): A Low-Cost Method for Commercial IVs.
   Sensors (Basel). 2020 Mar 27;20(7):1870. doi: 10.3390/s20071870.
10. Research on Lane a Compensation Method Based on Multi-Sensor Fusion.
   Sensors (Basel). 2019 Apr 2;19(7):1584. doi: 10.3390/s19071584.