
Two-Dimensional Frontier-Based Viewpoint Generation for Exploring and Mapping Underwater Environments

Affiliations

Underwater Robotics Research Center (CIRS), Computer Vision and Robotics Institute (VICOROB), Universitat de Girona, 17003 Girona, Spain.

Department of Computer Science, Rice University, Houston, TX 77005, USA.

Publication Information

Sensors (Basel). 2019 Mar 25;19(6):1460. doi: 10.3390/s19061460.

DOI: 10.3390/s19061460
PMID: 30934639
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6470787/
Abstract

To autonomously explore complex underwater environments, it is convenient to develop motion planning strategies that do not depend on prior information. In this publication, we present a robotic exploration algorithm for autonomous underwater vehicles (AUVs) that is able to guide the robot so that it explores an unknown 2-dimensional (2D) environment. The algorithm is built upon view planning (VP) and frontier-based (FB) strategies. Traditional robotic exploration algorithms seek full coverage of the scene with data from only one sensor. If data coverage is required for multiple sensors, multiple exploration missions are required. Our approach has been designed to sense the environment achieving full coverage with data from two sensors in a single exploration mission: occupancy data from the profiling sonar, from which the shape of the environment is perceived, and optical data from the camera, to capture the details of the environment. This saves time and mission costs. The algorithm has been designed to be computationally efficient, so that it can run online in the AUV's onboard computer. In our approach, the environment is represented using a labeled quadtree occupancy map which, at the same time, is used to generate the viewpoints that guide the exploration. We have tested the algorithm in different environments through numerous experiments, which include sea operations using the Sparus II AUV and its sensor suite.
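The abstract describes a frontier-based (FB) strategy: frontiers are the boundaries between explored free space and unknown space, and viewpoints are generated from them to guide the AUV. As a rough illustration of the frontier-detection step only (the paper itself works on a labeled quadtree map, not a uniform grid; the cell encoding and the helper name `find_frontiers` below are assumptions for this sketch):

```python
# Illustrative sketch: frontier detection on a uniform 2D occupancy grid.
# The paper uses a labeled quadtree and generates viewpoints for both sonar
# and camera coverage; this only shows the core frontier idea.

FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def find_frontiers(grid):
    """Return (row, col) cells that are free and border unknown space."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            # A free cell with at least one unknown 4-neighbor is a frontier:
            # it lies on the boundary between explored and unexplored space.
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == UNKNOWN:
                    frontiers.append((r, c))
                    break
    return frontiers

# Tiny example: an explored free row opening onto unknown space below it.
grid = [
    [FREE, FREE, FREE, FREE],
    [OCCUPIED, UNKNOWN, UNKNOWN, UNKNOWN],
]
print(find_frontiers(grid))  # → [(0, 1), (0, 2), (0, 3)]
```

In the paper's pipeline, candidate viewpoints would then be placed so the sonar and camera footprints cover these frontier cells, rather than navigating to the frontiers directly.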


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/3f93cb00d2da/sensors-19-01460-g025.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/68090db1c2dc/sensors-19-01460-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/31cef4367d89/sensors-19-01460-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/a967a043b9c8/sensors-19-01460-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/c6ce9be837a3/sensors-19-01460-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/db5fec5ee05e/sensors-19-01460-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/891f6dc3319e/sensors-19-01460-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/2ae89e5bbd05/sensors-19-01460-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/67c31121d7d8/sensors-19-01460-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/207385f26ed8/sensors-19-01460-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/09f8b9067866/sensors-19-01460-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/f479a4d276fc/sensors-19-01460-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/9556da5ef393/sensors-19-01460-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/c5f8aef9dbfd/sensors-19-01460-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/d9939a9148ca/sensors-19-01460-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/a2c71e4434af/sensors-19-01460-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/1bded8b652e3/sensors-19-01460-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/d08df8ea55cd/sensors-19-01460-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/13f625671dfb/sensors-19-01460-g018.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/b8d265a3294a/sensors-19-01460-g019.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/6dc47ef7c778/sensors-19-01460-g020.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/0d0d363f7d62/sensors-19-01460-g021.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/d70db000b9bb/sensors-19-01460-g022.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/dd2eb7b60c1e/sensors-19-01460-g023.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/37a7/6470787/f894eda807e1/sensors-19-01460-g024.jpg

Similar Articles

1. Two-Dimensional Frontier-Based Viewpoint Generation for Exploring and Mapping Underwater Environments.
Sensors (Basel). 2019 Mar 25;19(6):1460. doi: 10.3390/s19061460.
2. Multi-AUV Target Search Based on Bioinspired Neurodynamics Model in 3-D Underwater Environments.
IEEE Trans Neural Netw Learn Syst. 2016 Nov;27(11):2364-2374. doi: 10.1109/TNNLS.2015.2482501. Epub 2015 Oct 16.
3. A Probabilistic and Highly Efficient Topology Control Algorithm for Underwater Cooperating AUV Networks.
Sensors (Basel). 2017 May 4;17(5):1022. doi: 10.3390/s17051022.
4. AUV Path Planning Considering Ocean Current Disturbance Based on Cloud Desktop Technology.
Sensors (Basel). 2023 Aug 29;23(17):7510. doi: 10.3390/s23177510.
5. Autonomous Underwater Navigation and Optical Mapping in Unknown Natural Environments.
Sensors (Basel). 2016 Jul 26;16(8):1174. doi: 10.3390/s16081174.
6. A Predictive Guidance Obstacle Avoidance Algorithm for AUV in Unknown Environments.
Sensors (Basel). 2019 Jun 27;19(13):2862. doi: 10.3390/s19132862.
7. A Real-Time Path Planning Algorithm for AUV in Unknown Underwater Environment Based on Combining PSO and Waypoint Guidance.
Sensors (Basel). 2018 Dec 21;19(1):20. doi: 10.3390/s19010020.
8. An Adaptive Prediction Target Search Algorithm for Multi-AUVs in an Unknown 3D Environment.
Sensors (Basel). 2018 Nov 9;18(11):3853. doi: 10.3390/s18113853.
9. Improved Artificial Potential Field Algorithm Assisted by Multisource Data for AUV Path Planning.
Sensors (Basel). 2023 Jul 26;23(15):6680. doi: 10.3390/s23156680.
10. A Fuzzy-Based Risk Assessment Framework for Autonomous Underwater Vehicle Under-Ice Missions.
Risk Anal. 2019 Dec;39(12):2744-2765. doi: 10.1111/risa.13376. Epub 2019 Jul 18.

References Cited in This Article

1. H-SLAM: Rao-Blackwellized Particle Filter SLAM Using Hilbert Maps.
Sensors (Basel). 2018 May 1;18(5):1386. doi: 10.3390/s18051386.
2. Autonomous Underwater Navigation and Optical Mapping in Unknown Natural Environments.
Sensors (Basel). 2016 Jul 26;16(8):1174. doi: 10.3390/s16081174.