

Survey of Datafusion Techniques for Laser and Vision Based Sensor Integration for Autonomous Navigation

Affiliation

Department of Electrical Engineering, the University of Texas at San Antonio, 1, UTSA Cir., San Antonio, TX 78249, USA.

Publication

Sensors (Basel). 2020 Apr 12;20(8):2180. doi: 10.3390/s20082180.

DOI: 10.3390/s20082180
PMID: 32290582
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7218742/
Abstract

This paper focuses on data fusion, which is fundamental to one of the most important modules in any autonomous system: perception. Over the past decade, there has been a surge in the usage of smart/autonomous mobility systems. Such systems can be used in various areas of life like safe mobility for the disabled, senior citizens, and so on and are dependent on accurate sensor information in order to function optimally. This information may be from a single sensor or a suite of sensors with the same or different modalities. We review various types of sensors, their data, and the need for fusion of the data with each other to output the best data for the task at hand, which in this case is autonomous navigation. In order to obtain such accurate data, we need to have optimal technology to read the sensor data, process the data, eliminate or at least reduce the noise and then use the data for the required tasks. We present a survey of the current data processing techniques that implement data fusion using different sensors like LiDAR that use light scan technology, stereo/depth cameras, Red Green Blue monocular (RGB) and Time-of-flight (TOF) cameras that use optical technology and review the efficiency of using fused data from multiple sensors rather than a single sensor in autonomous navigation tasks like mapping, obstacle detection, and avoidance or localization. This survey will provide sensor information to researchers who intend to accomplish the task of motion control of a robot and detail the use of LiDAR and cameras to accomplish robot navigation.
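The abstract's central idea, combining readings from sensors of different noise levels so the fused output is better than any single sensor, can be sketched with the simplest fusion primitive: inverse-variance weighting of two independent measurements. This is a generic illustration, not an algorithm from the paper; the sensor names and numbers are made up.

```python
# Minimal sketch of a core data-fusion primitive: inverse-variance
# weighting of two independent, noisy measurements of the same range
# (say, one LiDAR return and one stereo-depth estimate). This is the
# scalar, single-step form of the Kalman measurement update.

def fuse(z1: float, var1: float, z2: float, var2: float) -> tuple[float, float]:
    """Fuse two measurements z1, z2 with noise variances var1, var2.

    Each measurement is weighted by the inverse of its variance, so
    the less noisy sensor dominates. The fused variance is always
    smaller than either input variance, which is the quantitative
    sense in which fused data beats single-sensor data.
    """
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# Hypothetical example: LiDAR reads 4.95 m (var 0.01), stereo camera
# reads 5.20 m (var 0.09). The fused range of 4.975 m sits close to
# the more precise LiDAR value, with variance 0.009 < 0.01.
distance, variance = fuse(4.95, 0.01, 5.20, 0.09)
print(distance, variance)
```

Real systems extend this same weighting idea to full state vectors via Kalman or particle filters, which are among the fusion techniques such surveys cover.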

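Before LiDAR and camera data can be fused at all, the two sensors must share a coordinate frame. The standard step is the pinhole projection: transform a LiDAR point into the camera frame with an extrinsic rigid transform [R | t], then project it to pixels with the intrinsic matrix K. The sketch below is illustrative only; the calibration values are invented, not from the paper.

```python
import numpy as np

# Illustrative pinhole projection for LiDAR-camera fusion.
# All calibration numbers below are made-up assumptions.

K = np.array([[800.0,   0.0, 320.0],    # fx,  0, cx  (intrinsics)
              [  0.0, 800.0, 240.0],    #  0, fy, cy
              [  0.0,   0.0,   1.0]])

R = np.eye(3)                    # assume LiDAR and camera axes aligned
t = np.array([0.0, -0.1, 0.0])   # assume camera mounted 10 cm above LiDAR

def lidar_to_pixel(p_lidar: np.ndarray) -> np.ndarray:
    """Project one LiDAR point (x right, y down, z forward) to pixels."""
    p_cam = R @ p_lidar + t      # extrinsic: LiDAR frame -> camera frame
    uvw = K @ p_cam              # intrinsic: camera frame -> homogeneous pixels
    return uvw[:2] / uvw[2]      # perspective divide

# A point 5 m ahead and 0.5 m to the right projects near the image
# center-right: u ≈ 400, v ≈ 224 for a 640x480 image.
print(lidar_to_pixel(np.array([0.5, 0.0, 5.0])))
```

Once each LiDAR point has a pixel coordinate, its depth can be attached to image features, which is the basis of the camera-LiDAR mapping, obstacle-detection, and localization pipelines the survey reviews.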

Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c963/7218742/f223082432ed/sensors-20-02180-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c963/7218742/90a1c0f30aeb/sensors-20-02180-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c963/7218742/d8564ae206eb/sensors-20-02180-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c963/7218742/527e224c99cb/sensors-20-02180-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c963/7218742/5376a4a63c19/sensors-20-02180-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c963/7218742/52f7f0a00daa/sensors-20-02180-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c963/7218742/750fb9772910/sensors-20-02180-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c963/7218742/0447cd3a02c6/sensors-20-02180-g008.jpg

Similar articles

1. Survey of Datafusion Techniques for Laser and Vision Based Sensor Integration for Autonomous Navigation. Sensors (Basel). 2020 Apr 12;20(8):2180. doi: 10.3390/s20082180.
2. Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review. Sensors (Basel). 2021 Mar 18;21(6):2140. doi: 10.3390/s21062140.
3. High Definition 3D Map Creation Using GNSS/IMU/LiDAR Sensor Integration to Support Autonomous Vehicle Navigation. Sensors (Basel). 2020 Feb 7;20(3):899. doi: 10.3390/s20030899.
4. Obstacle Detection System for Agricultural Mobile Robot Application Using RGB-D Cameras. Sensors (Basel). 2021 Aug 5;21(16):5292. doi: 10.3390/s21165292.
5. GNSS/LiDAR-Based Navigation of an Aerial Robot in Sparse Forests. Sensors (Basel). 2019 Sep 20;19(19):4061. doi: 10.3390/s19194061.
6. Robust Fusion of LiDAR and Wide-Angle Camera Data for Autonomous Mobile Robots. Sensors (Basel). 2018 Aug 20;18(8):2730. doi: 10.3390/s18082730.
7. A Review of Visual-LiDAR Fusion based Simultaneous Localization and Mapping. Sensors (Basel). 2020 Apr 7;20(7):2068. doi: 10.3390/s20072068.
8. A 2.5D Map-Based Mobile Robot Localization via Cooperation of Aerial and Ground Robots. Sensors (Basel). 2017 Nov 25;17(12):2730. doi: 10.3390/s17122730.
9. VA-LOAM: Visual Assist LiDAR Odometry and Mapping for Accurate Autonomous Navigation. Sensors (Basel). 2024 Jun 13;24(12):3831. doi: 10.3390/s24123831.
10. Enabling Autonomous Navigation for Affordable Scooters. Sensors (Basel). 2018 Jun 5;18(6):1829. doi: 10.3390/s18061829.

Cited by

1. A Review of Research on SLAM Technology Based on the Fusion of LiDAR and Vision. Sensors (Basel). 2025 Feb 27;25(5):1447. doi: 10.3390/s25051447.
2. Emerging Topics in Joint Radio-Based Positioning, Sensing, and Communications. Sensors (Basel). 2025 Feb 5;25(3):948. doi: 10.3390/s25030948.
3. Enhancing Off-Road Topography Estimation by Fusing LIDAR and Stereo Camera Data with Interpolated Ground Plane. Sensors (Basel). 2025 Jan 16;25(2):509. doi: 10.3390/s25020509.
4. Point Cloud Densification Algorithm for Multiple Cameras and Lidars Data Fusion. Sensors (Basel). 2024 Sep 5;24(17):5786. doi: 10.3390/s24175786.
5. A comprehensive review of navigation systems for visually impaired individuals. Heliyon. 2024 May 23;10(11):e31825. doi: 10.1016/j.heliyon.2024.e31825. eCollection 2024 Jun 15.
6. Data Fusion of RGB and Depth Data with Image Enhancement. J Imaging. 2024 Mar 21;10(3):73. doi: 10.3390/jimaging10030073.
7. Triangle-Mesh-Rasterization-Projection (TMRP): An Algorithm to Project a Point Cloud onto a Consistent, Dense and Accurate 2D Raster Image. Sensors (Basel). 2023 Aug 8;23(16):7030. doi: 10.3390/s23167030.
8. Determination of trajectories using IKZ/CF inertial navigation: Methodological proposal. Heliyon. 2023 Feb 23;9(3):e13863. doi: 10.1016/j.heliyon.2023.e13863. eCollection 2023 Mar.
9. Sensors and System for Vehicle Navigation. Sensors (Basel). 2022 Feb 23;22(5):1723. doi: 10.3390/s22051723.
10. Human-Robot Perception in Industrial Environments: A Survey. Sensors (Basel). 2021 Feb 24;21(5):1571. doi: 10.3390/s21051571.

References

1. Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles. Sensors (Basel). 2019 Oct 9;19(20):4357. doi: 10.3390/s19204357.
2. Robust Fusion of LiDAR and Wide-Angle Camera Data for Autonomous Mobile Robots. Sensors (Basel). 2018 Aug 20;18(8):2730. doi: 10.3390/s18082730.
3. An Adaptive Multi-Sensor Data Fusion Method Based on Deep Convolutional Neural Networks for Fault Diagnosis of Planetary Gearbox. Sensors (Basel). 2017 Feb 21;17(2):414. doi: 10.3390/s17020414.
4. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Trans Pattern Anal Mach Intell. 2017 Jun;39(6):1137-1149. doi: 10.1109/TPAMI.2016.2577031. Epub 2016 Jun 6.
5. Fast Feature Pyramids for Object Detection. IEEE Trans Pattern Anal Mach Intell. 2014 Aug;36(8):1532-45. doi: 10.1109/TPAMI.2014.2300479.
6. A review of data fusion techniques. ScientificWorldJournal. 2013 Oct 27;2013:704504. doi: 10.1155/2013/704504. eCollection 2013.
7. Sensor fusion of monocular cameras and laser rangefinders for line-based Simultaneous Localization and Mapping (SLAM) tasks in autonomous mobile robots. Sensors (Basel). 2012;12(1):429-52. doi: 10.3390/s120100429. Epub 2012 Jan 4.
8. Online Multiperson Tracking-by-Detection from a Single, Uncalibrated Camera. IEEE Trans Pattern Anal Mach Intell. 2011 Sep;33(9):1820-33. doi: 10.1109/TPAMI.2010.232. Epub 2010 Dec 23.
9. Object detection with discriminatively trained part-based models. IEEE Trans Pattern Anal Mach Intell. 2010 Sep;32(9):1627-45. doi: 10.1109/TPAMI.2009.167.
10. Intelligent assistive technology applications to dementia care: current capabilities, limitations, and future challenges. Am J Geriatr Psychiatry. 2009 Feb;17(2):88-104. doi: 10.1097/JGP.0b013e318187dde5.