
Sensor Data Fusion for a Mobile Robot Using Neural Networks.

Affiliations

Tecnologico de Monterrey, Escuela de Ingenieria y Ciencias, Av. Epigmenio González 500, Fracc. San Pablo, Querétaro 76130, Mexico.

Tecnologico de Monterrey, Escuela de Ingenieria y Ciencias, Av. Eugenio Garza Sada 300, Lomas del Tecnológico, San Luis Potosí 78211, Mexico.

Publication Information

Sensors (Basel). 2021 Dec 31;22(1):305. doi: 10.3390/s22010305.

DOI: 10.3390/s22010305
PMID: 35009849
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8749872/
Abstract

Mobile robots must be capable to obtain an accurate map of their surroundings to move within it. To detect different materials that might be undetectable to one sensor but not others it is necessary to construct at least a two-sensor fusion scheme. With this, it is possible to generate a 2D occupancy map in which glass obstacles are identified. An artificial neural network is used to fuse data from a tri-sensor (RealSense Stereo camera, 2D 360° LiDAR, and Ultrasonic Sensors) setup capable of detecting glass and other materials typically found in indoor environments that may or may not be visible to traditional 2D LiDAR sensors, hence the expression improved LiDAR. A preprocessing scheme is implemented to filter all the outliers, project a 3D pointcloud to a 2D plane and adjust distance data. With a Neural Network as a data fusion algorithm, we integrate all the information into a single, more accurate distance-to-obstacle reading to finally generate a 2D Occupancy Grid Map (OGM) that considers all sensors information. The Robotis Turtlebot3 Waffle Pi robot is used as the experimental platform to conduct experiments given the different fusion strategies. Test results show that with such a fusion algorithm, it is possible to detect glass and other obstacles with an estimated root-mean-square error (RMSE) of 3 cm with multiple fusion strategies.
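The fusion step described in the abstract can be illustrated with a minimal sketch. This is not the paper's network: the authors train a multi-layer neural network on real tri-sensor (stereo camera, LiDAR, ultrasonic) data, whereas the toy below reduces the idea to a single linear layer fitted by least squares on synthetic readings, and the per-sensor noise scales are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: a true obstacle distance plus per-sensor noise.
# The noise standard deviations are assumptions, not values from the paper.
n = 1000
true_d = rng.uniform(0.2, 3.0, size=n)            # metres
lidar = true_d + rng.normal(0.0, 0.03, size=n)    # 2D LiDAR reading
camera = true_d + rng.normal(0.0, 0.05, size=n)   # stereo-camera depth
ultra = true_d + rng.normal(0.0, 0.08, size=n)    # ultrasonic reading

# Stack the three readings (plus a bias term) as the fusion input.
X = np.column_stack([lidar, camera, ultra, np.ones(n)])

# Fit the fusion weights by least squares -- a single linear layer standing
# in for the paper's multi-layer network.
w, *_ = np.linalg.lstsq(X, true_d, rcond=None)

fused = X @ w
rmse = np.sqrt(np.mean((fused - true_d) ** 2))
print(f"fused RMSE: {rmse:.4f} m")
```

Even this degenerate "network" produces a fused estimate more accurate than the best single sensor, which is the core motivation for the fusion scheme; the paper's nonlinear network additionally handles cases (such as glass) where one sensor's reading is not merely noisy but wrong.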

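The final step the abstract describes — turning fused distance-to-obstacle readings into a 2D Occupancy Grid Map — can be sketched as follows. This is an illustrative toy, not the paper's implementation: the grid size, cell resolution, and simple ray-stepping scheme are all assumptions.

```python
import math

def build_ogm(scan, grid_size=40, cell=0.1):
    """Rasterise (bearing, fused distance) readings into a 2D occupancy
    grid centred on the robot: 0 = unknown, 1 = free, 2 = occupied."""
    grid = [[0] * grid_size for _ in range(grid_size)]
    cx = cy = grid_size // 2
    for bearing, dist in scan:
        steps = int(dist / cell)
        for i in range(steps + 1):
            r = i * cell
            x = cx + round(math.cos(bearing) * r / cell)
            y = cy + round(math.sin(bearing) * r / cell)
            if not (0 <= x < grid_size and 0 <= y < grid_size):
                break
            # Cells along the ray are free; the ray endpoint is occupied.
            grid[y][x] = 2 if i == steps else max(grid[y][x], 1)
    return grid

# A toy scan: one obstacle 1.0 m ahead (bearing 0), one 0.5 m to the side.
ogm = build_ogm([(0.0, 1.0), (math.pi / 2, 0.5)])
print(ogm[20][30], ogm[25][20])   # the two obstacle endpoint cells → 2 2
```

In the paper this rasterisation is driven by the network's fused readings, which is what lets glass obstacles (invisible to the raw 2D LiDAR) appear as occupied cells in the resulting map.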

[Article figures g001–g037: image files available via the PMC full text linked above.]

Similar Articles

1. Sensor Data Fusion for a Mobile Robot Using Neural Networks.
   Sensors (Basel). 2021 Dec 31;22(1):305. doi: 10.3390/s22010305.
2. Environment Mapping Using Sensor Fusion of 2D Laser Scanner and 3D Ultrasonic Sensor for a Real Mobile Robot.
   Sensors (Basel). 2021 May 4;21(9):3184. doi: 10.3390/s21093184.
3. Multi-Robot 2.5D Localization and Mapping Using a Monte Carlo Algorithm on a Multi-Level Surface.
   Sensors (Basel). 2021 Jul 4;21(13):4588. doi: 10.3390/s21134588.
4. Multi-LiDAR Mapping for Scene Segmentation in Indoor Environments for Mobile Robots.
   Sensors (Basel). 2022 May 12;22(10):3690. doi: 10.3390/s22103690.
5. Sensor fusion by pseudo information measure: a mobile robot application.
   ISA Trans. 2002 Jul;41(3):283-301. doi: 10.1016/s0019-0578(07)60088-3.
6. Rao-Blackwellized Particle Filter Algorithm Integrated with Neural Network Sensor Model Using Laser Distance Sensor.
   Micromachines (Basel). 2023 Feb 27;14(3):560. doi: 10.3390/mi14030560.
7. Autonomous Navigation by Mobile Robot with Sensor Fusion Based on Deep Reinforcement Learning.
   Sensors (Basel). 2024 Jun 16;24(12):3895. doi: 10.3390/s24123895.
8. Obstacle Avoidance of Multi-Sensor Intelligent Robot Based on Road Sign Detection.
   Sensors (Basel). 2021 Oct 12;21(20):6777. doi: 10.3390/s21206777.
9. Neural network-based multiple robot simultaneous localization and mapping.
   IEEE Trans Neural Netw. 2011 Dec;22(12):2376-87. doi: 10.1109/TNN.2011.2176541. Epub 2011 Dec 5.
10. Fuzzy Guided Autonomous Nursing Robot through Wireless Beacon Network.
   Multimed Tools Appl. 2022;81(3):3297-3325. doi: 10.1007/s11042-021-11264-6. Epub 2021 Jul 29.

Cited By

1. A New Association Approach for Multi-Sensor Air Traffic Surveillance Data Based on Deep Neural Networks.
   Sensors (Basel). 2025 Feb 4;25(3):931. doi: 10.3390/s25030931.
2. Enhancing Off-Road Topography Estimation by Fusing LIDAR and Stereo Camera Data with Interpolated Ground Plane.
   Sensors (Basel). 2025 Jan 16;25(2):509. doi: 10.3390/s25020509.
3. Development of a Neural Network for Target Gas Detection in Interdigitated Electrode Sensor-Based E-Nose Systems.
   Sensors (Basel). 2024 Aug 16;24(16):5315. doi: 10.3390/s24165315.

References

1. LiDAR-Based Glass Detection for Improved Occupancy Grid Mapping.
   Sensors (Basel). 2021 Mar 24;21(7):2263. doi: 10.3390/s21072263.
2. Comparison of artificial neural network and logistic regression models for prediction of outcomes in trauma patients: A systematic review and meta-analysis.
   Injury. 2019 Feb;50(2):244-250. doi: 10.1016/j.injury.2019.01.007. Epub 2019 Jan 11.
3. An Adaptive Multi-Sensor Data Fusion Method Based on Deep Convolutional Neural Networks for Fault Diagnosis of Planetary Gearbox.
   Sensors (Basel). 2017 Feb 21;17(2):414. doi: 10.3390/s17020414.
4. A Review of Sensing Technologies for Indoor Autonomous Mobile Robots.
   Sensors (Basel). 2024 Feb 14;24(4):1222. doi: 10.3390/s24041222.
5. Multi-LiDAR Mapping for Scene Segmentation in Indoor Environments for Mobile Robots.
   Sensors (Basel). 2022 May 12;22(10):3690. doi: 10.3390/s22103690.
6. A review of data fusion techniques.
   ScientificWorldJournal. 2013 Oct 27;2013:704504. doi: 10.1155/2013/704504. eCollection 2013.