

A Review of Simultaneous Localization and Mapping for the Robotic-Based Nondestructive Evaluation of Infrastructures

Authors

Ghadimzadeh Alamdari Ali, Zade Farzad Azizi, Ebrahimkhanlou Arvin

Affiliations

Department of Mechanical Engineering and Mechanics (MEM), Drexel University, 3141 Chestnut St., Philadelphia, PA 19104, USA.

Mechanical Engineering Department, Ferdowsi University of Mashhad, Mashhad 9177948944, Iran.

Publication

Sensors (Basel). 2025 Jan 24;25(3):712. doi: 10.3390/s25030712.

DOI: 10.3390/s25030712
PMID: 39943350
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11820643/
Abstract

The maturity of simultaneous localization and mapping (SLAM) methods has now reached a significant level that motivates in-depth and problem-specific reviews. The focus of this study is to investigate the evolution of vision-based, LiDAR-based, and combined methods and to evaluate their performance in enclosed and GPS-denied (EGD) conditions for infrastructure inspection. This paper categorizes and analyzes the SLAM methods in detail, considering the sensor fusion type and chronological order. The paper analyzes the performance of eleven open-source SLAM solutions, comprising two visual methods (VINS-Mono, ORB-SLAM 2), eight LiDAR-based methods (LIO-SAM, Fast-LIO 2, SC-Fast-LIO 2, LeGO-LOAM, SC-LeGO-LOAM, A-LOAM, LINS, F-LOAM), and one combined LiDAR and vision-based method (LVI-SAM). The benchmarking section analyzes accuracy and computational resource consumption using our collected dataset and a test dataset. According to the results, LiDAR-based methods performed well under EGD conditions. Contrary to common presumptions, some vision-based methods demonstrate acceptable performance in EGD environments. Additionally, combining vision-based techniques with LiDAR-based methods demonstrates superior performance compared to either vision-based or LiDAR-based methods individually.
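The benchmarking described in the abstract compares trajectory accuracy across SLAM methods. A common accuracy metric in the SLAM literature for this kind of comparison is absolute trajectory error (ATE): the RMS position error after rigidly aligning the estimated trajectory to ground truth. The sketch below is a generic illustration of that metric, not the paper's exact evaluation protocol; the function name and the NumPy formulation are our own.

```python
import numpy as np

def absolute_trajectory_error(gt, est):
    """RMS absolute trajectory error (ATE) between a ground-truth and an
    estimated trajectory (N x D arrays of positions), after rigidly
    aligning the estimate to the ground truth (Kabsch/Umeyama)."""
    gt = np.asarray(gt, dtype=float)
    est = np.asarray(est, dtype=float)
    # Center both trajectories on their centroids.
    mu_gt, mu_est = gt.mean(axis=0), est.mean(axis=0)
    gt_c, est_c = gt - mu_gt, est - mu_est
    # Optimal rotation from the SVD of the cross-covariance matrix,
    # with a reflection correction on the last singular direction.
    H = est_c.T @ gt_c
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (gt.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T  # rotation mapping the estimate frame to the gt frame
    aligned = est_c @ R.T + mu_gt
    # RMS of the per-pose position errors.
    return float(np.sqrt(np.mean(np.sum((gt - aligned) ** 2, axis=1))))
```

Because the estimate is aligned before scoring, a trajectory that differs from ground truth only by a rigid motion scores an ATE of (numerically) zero; the metric isolates drift and shape error, which is what distinguishes the SLAM solutions being benchmarked.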


[Article figures g001–g027 are available via the PMC full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11820643/]

Similar Articles

1. VA-LOAM: Visual Assist LiDAR Odometry and Mapping for Accurate Autonomous Navigation. Sensors (Basel). 2024 Jun 13;24(12):3831. doi: 10.3390/s24123831.
2. Stereo and LiDAR Loosely Coupled SLAM Constrained Ground Detection. Sensors (Basel). 2024 Oct 24;24(21):6828. doi: 10.3390/s24216828.
3. A Review of Visual-LiDAR Fusion based Simultaneous Localization and Mapping. Sensors (Basel). 2020 Apr 7;20(7):2068. doi: 10.3390/s20072068.
4. Sensor Fusion-Based Approach to Eliminating Moving Objects for SLAM in Dynamic Environments. Sensors (Basel). 2021 Jan 1;21(1):230. doi: 10.3390/s21010230.
5. A Review of Research on SLAM Technology Based on the Fusion of LiDAR and Vision. Sensors (Basel). 2025 Feb 27;25(5):1447. doi: 10.3390/s25051447.
6. NR5G-SAM: A SLAM Framework for Field Robot Applications Based on 5G New Radio. Sensors (Basel). 2023 Jun 5;23(11):5354. doi: 10.3390/s23115354.
7. Research on 3D LiDAR outdoor SLAM algorithm based on LiDAR/IMU tight coupling. Sci Rep. 2025 Apr 1;15(1):11175. doi: 10.1038/s41598-025-95730-3.
8. SLAM and 3D Semantic Reconstruction Based on the Fusion of Lidar and Monocular Vision. Sensors (Basel). 2023 Jan 29;23(3):1502. doi: 10.3390/s23031502.
9. LeGO-LOAM-SC: An Improved Simultaneous Localization and Mapping Method Fusing LeGO-LOAM and Scan Context for Underground Coalmine. Sensors (Basel). 2022 Jan 11;22(2):520. doi: 10.3390/s22020520.

Cited By

1. A Multi-Sensor Fusion-Based Localization Method for a Magnetic Adhesion Wall-Climbing Robot. Sensors (Basel). 2025 Aug 14;25(16):5051. doi: 10.3390/s25165051.
2. Semantic Fusion Algorithm of 2D LiDAR and Camera Based on Contour and Inverse Projection. Sensors (Basel). 2025 Apr 17;25(8):2526. doi: 10.3390/s25082526.

References

1. A Review on Visual-SLAM: Advancements from Geometric Modelling to Learning-Based Semantic Scene Understanding Using Multi-Modal Sensor Fusion. Sensors (Basel). 2022 Sep 25;22(19):7265. doi: 10.3390/s22197265.
2. LeGO-LOAM-SC: An Improved Simultaneous Localization and Mapping Method Fusing LeGO-LOAM and Scan Context for Underground Coalmine. Sensors (Basel). 2022 Jan 11;22(2):520. doi: 10.3390/s22020520.
3. LiDAR-Based Glass Detection for Improved Occupancy Grid Mapping. Sensors (Basel). 2021 Mar 24;21(7):2263. doi: 10.3390/s21072263.
4. Role of Deep Learning in Loop Closure Detection for Visual and Lidar SLAM: A Survey. Sensors (Basel). 2021 Feb 10;21(4):1243. doi: 10.3390/s21041243.
5. Fast and Robust Iterative Closest Point. IEEE Trans Pattern Anal Mach Intell. 2022 Jul;44(7):3450-3466. doi: 10.1109/TPAMI.2021.3054619. Epub 2022 Jun 3.
6. A Review of Visual-LiDAR Fusion based Simultaneous Localization and Mapping. Sensors (Basel). 2020 Apr 7;20(7):2068. doi: 10.3390/s20072068.
7. A Robust Method for Detecting Parking Areas in Both Indoor and Outdoor Environments. Sensors (Basel). 2018 Jun 11;18(6):1903. doi: 10.3390/s18061903.
8. Direct Sparse Odometry. IEEE Trans Pattern Anal Mach Intell. 2018 Mar;40(3):611-625. doi: 10.1109/TPAMI.2017.2658577. Epub 2017 Apr 12.
9. High-Speed Tracking with Kernelized Correlation Filters. IEEE Trans Pattern Anal Mach Intell. 2015 Mar;37(3):583-96. doi: 10.1109/TPAMI.2014.2345390.
10. Exploration. Making smarter, savvier robots. Science. 2010 Jul 30;329(5991):508-9. doi: 10.1126/science.329.5991.508.