
Automatically Annotated Dataset of a Ground Mobile Robot in Natural Environments via Gazebo Simulations.

Affiliations

Robotics and Mechatronics Lab, Andalucía Tech, Universidad de Málaga, 29071 Málaga, Spain.

Publication

Sensors (Basel). 2022 Jul 26;22(15):5599. doi: 10.3390/s22155599.

DOI: 10.3390/s22155599
PMID: 35898100
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9331783/
Abstract

This paper presents a new synthetic dataset obtained from Gazebo simulations of an Unmanned Ground Vehicle (UGV) moving on different natural environments. To this end, a Husky mobile robot equipped with a tridimensional (3D) Light Detection and Ranging (LiDAR) sensor, a stereo camera, a Global Navigation Satellite System (GNSS) receiver, an Inertial Measurement Unit (IMU) and wheel tachometers has followed several paths using the Robot Operating System (ROS). Both points from LiDAR scans and pixels from camera images, have been automatically labeled into their corresponding object class. For this purpose, unique reflectivity values and flat colors have been assigned to each object present in the modeled environments. As a result, a public dataset, which also includes 3D pose ground-truth, is provided as ROS bag files and as human-readable data. Potential applications include supervised learning and benchmarking for UGV navigation on natural environments. Moreover, to allow researchers to easily modify the dataset or to directly use the simulations, the required code has also been released.
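Because each simulated object is assigned a unique reflectivity value, per-point semantic labels can be recovered directly from the LiDAR intensity channel when consuming the dataset. A minimal sketch of that lookup, assuming a hypothetical reflectivity-to-class mapping (the actual values are defined in the released code, not here):

```python
import numpy as np

# Hypothetical mapping from the unique reflectivity value assigned to each
# simulated object class to a semantic label. The real per-class values are
# defined in the dataset's released code; these are placeholders.
REFLECTIVITY_TO_CLASS = {
    0.10: "ground",
    0.30: "tree",
    0.50: "rock",
    0.70: "bush",
}

def label_scan(reflectivity: np.ndarray, tol: float = 1e-3) -> list:
    """Assign a class label to each LiDAR point by matching its reflectivity
    against the per-class values; unmatched points get None."""
    labels = []
    for r in reflectivity:
        label = None
        for value, name in REFLECTIVITY_TO_CLASS.items():
            if abs(r - value) <= tol:
                label = name
                break
        labels.append(label)
    return labels

# Example: three points, the last with a reflectivity not in the mapping
refl = np.array([0.10, 0.50, 0.99])
print(label_scan(refl))  # ['ground', 'rock', None]
```

The same idea applies to camera images: since objects are rendered with flat colors, a pixel's RGB value can be matched against a per-class color table in the same way.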

[Article figures sensors-22-05599-g001 through g022 are available in the PMC full text.]

Similar articles

1
Automatically Annotated Dataset of a Ground Mobile Robot in Natural Environments via Gazebo Simulations.
Sensors (Basel). 2022 Jul 26;22(15):5599. doi: 10.3390/s22155599.
2
Reinforcement and Curriculum Learning for Off-Road Navigation of an UGV with a 3D LiDAR.
Sensors (Basel). 2023 Mar 18;23(6):3239. doi: 10.3390/s23063239.
3
NR5G-SAM: A SLAM Framework for Field Robot Applications Based on 5G New Radio.
Sensors (Basel). 2023 Jun 5;23(11):5354. doi: 10.3390/s23115354.
4
A LiDAR-Camera-Inertial-GNSS Apparatus for 3D Multimodal Dataset Collection in Woodland Scenarios.
Sensors (Basel). 2023 Jul 26;23(15):6676. doi: 10.3390/s23156676.
5
A GNSS/INS/LiDAR Integration Scheme for UAV-Based Navigation in GNSS-Challenging Environments.
Sensors (Basel). 2022 Dec 16;22(24):9908. doi: 10.3390/s22249908.
6
Vision-Based Autonomous Following of a Moving Platform and Landing for an Unmanned Aerial Vehicle.
Sensors (Basel). 2023 Jan 11;23(2):829. doi: 10.3390/s23020829.
7
Traversability Assessment and Trajectory Planning of Unmanned Ground Vehicles with Suspension Systems on Rough Terrain.
Sensors (Basel). 2019 Oct 10;19(20):4372. doi: 10.3390/s19204372.
8
Reactive Navigation on Natural Environments by Continuous Classification of Ground Traversability.
Sensors (Basel). 2020 Nov 10;20(22):6423. doi: 10.3390/s20226423.
9
JRDB: A Dataset and Benchmark of Egocentric Robot Visual Perception of Humans in Built Environments.
IEEE Trans Pattern Anal Mach Intell. 2023 Jun;45(6):6748-6765. doi: 10.1109/TPAMI.2021.3070543. Epub 2023 May 8.
10
Pole-Like Object Extraction and Pole-Aided GNSS/IMU/LiDAR-SLAM System in Urban Area.
Sensors (Basel). 2020 Dec 13;20(24):7145. doi: 10.3390/s20247145.

Cited by

1
Reinforcement and Curriculum Learning for Off-Road Navigation of an UGV with a 3D LiDAR.
Sensors (Basel). 2023 Mar 18;23(6):3239. doi: 10.3390/s23063239.

References

1
Mobile Robot Localization and Mapping Algorithm Based on the Fusion of Image and Laser Point Cloud.
Sensors (Basel). 2022 May 28;22(11):4114. doi: 10.3390/s22114114.
2
Image Segmentation Using Deep Learning: A Survey.
IEEE Trans Pattern Anal Mach Intell. 2022 Jul;44(7):3523-3542. doi: 10.1109/TPAMI.2021.3059968. Epub 2022 Jun 3.
3
Learning-Based Methods of Perception and Navigation for Ground Vehicles in Unstructured Environments: A Review.
Sensors (Basel). 2020 Dec 25;21(1):73. doi: 10.3390/s21010073.
4
Reactive Navigation on Natural Environments by Continuous Classification of Ground Traversability.
Sensors (Basel). 2020 Nov 10;20(22):6423. doi: 10.3390/s20226423.