

Vision-Based Autonomous Following of a Moving Platform and Landing for an Unmanned Aerial Vehicle

Affiliations

Institute for Mechatronics Engineering & Cyber-Physical Systems (IMECH), Universidad de Málaga, 29071 Málaga, Spain.

Instituto Superior Técnico (IST), Universidade de Lisboa, 1049-001 Lisboa, Portugal.

Publication Information

Sensors (Basel). 2023 Jan 11;23(2):829. doi: 10.3390/s23020829.

DOI: 10.3390/s23020829
PMID: 36679628
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9862587/
Abstract

Interest in Unmanned Aerial Vehicles (UAVs) has increased due to their versatility and variety of applications; however, their battery life limits their applications. Heterogeneous multi-robot systems can offer a solution to this limitation by allowing an Unmanned Ground Vehicle (UGV) to serve as a recharging station for the aerial one. Moreover, cooperation between aerial and terrestrial robots allows them to overcome other individual limitations, such as communication link coverage or accessibility, and to solve highly complex tasks, e.g., environment exploration, infrastructure inspection or search and rescue. This work proposes a vision-based approach that enables an aerial robot to autonomously detect, follow, and land on a mobile ground platform. For this purpose, ArUco fiducial markers are used to estimate the relative pose between the UAV and UGV by processing RGB images provided by a monocular camera on board the UAV. The pose estimate is fed to a trajectory planner and four decoupled controllers to generate speed set-points relative to the UAV. Using a cascade loop strategy, these set-points are then sent to the UAV autopilot for inner loop control. The proposed solution has been tested both in simulation, with a digital twin of a solar farm using ROS, Gazebo and Ardupilot Software-in-the-Loop (SiL), and in the real world at IST Lisbon's outdoor facilities, with a UAV built on the basis of a DJ550 hexacopter and a modified Jackal ground robot from DJI and Clearpath Robotics, respectively. Pose estimation, trajectory planning and speed set-points are computed on board the UAV, using a Single Board Computer (SBC) running Ubuntu and ROS, without the need for external infrastructure.
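The perception step the abstract describes, recovering the UGV's relative pose from a square fiducial marker seen by a monocular camera, can be sketched with the standard planar-homography decomposition. The following is a minimal NumPy illustration, not the paper's implementation (which uses the ArUco detector and OpenCV's pose solvers); the function name, intrinsics and marker size are hypothetical placeholders.

```python
import numpy as np

def estimate_marker_pose(corners_px, K, marker_len):
    """Estimate a square marker's pose from its four corner pixels.

    corners_px : 4x2 pixel coordinates, ordered top-left, top-right,
                 bottom-right, bottom-left (ArUco's convention).
    K          : 3x3 camera intrinsic matrix.
    Returns (R, t) mapping marker-frame points into the camera frame.
    """
    h = marker_len / 2.0
    # Marker corners in the marker's own frame (the z = 0 plane).
    obj = np.array([[-h, h], [h, h], [h, -h], [-h, -h]])
    # Direct Linear Transform for the homography H: [u, v, 1]^T ~ H [X, Y, 1]^T
    A = []
    for (X, Y), (u, v) in zip(obj, corners_px):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)          # null vector of A, up to scale/sign
    # For a plane, K^-1 H = s * [r1 r2 t]; recover the scale s.
    B = np.linalg.solve(K, H)
    lam = 1.0 / np.linalg.norm(B[:, 0])
    if B[2, 2] < 0:                   # enforce the marker in front of the camera
        lam = -lam
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    # Re-orthonormalize: with noisy corners R is only approximately a rotation.
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t
```

In the system described above, this relative pose (marker, i.e. landing platform, with respect to the camera) is what would be handed to the trajectory planner and controllers.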

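The control side, four decoupled controllers turning the pose error into speed set-points that the autopilot's inner loop then tracks, can be illustrated with a simple proportional sketch. The class name, gains and saturation limits below are illustrative assumptions, not the paper's tuned values, and the paper's controllers may well include integral or derivative action.

```python
import numpy as np

class DecoupledVelocityController:
    """Four independent proportional controllers (x, y, z, yaw) that map the
    platform's pose error relative to the UAV into saturated velocity
    set-points for the autopilot's inner loop (outer loop of the cascade)."""

    def __init__(self, kp=(0.8, 0.8, 0.6, 1.2), v_max=(1.5, 1.5, 1.0, 0.8)):
        self.kp = np.asarray(kp)        # per-axis proportional gains
        self.v_max = np.asarray(v_max)  # per-axis set-point limits

    def update(self, pose_error):
        """pose_error = (ex, ey, ez, e_yaw) of the platform w.r.t. the UAV.
        Returns clipped set-points (vx, vy, vz, yaw_rate)."""
        cmd = self.kp * np.asarray(pose_error, dtype=float)
        return np.clip(cmd, -self.v_max, self.v_max)
```

Because the axes are decoupled, each channel can be tuned separately, and the saturation keeps the commanded speeds within what the inner attitude loop can safely track.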

Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/bf1573927858/sensors-23-00829-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/c04107b41cf8/sensors-23-00829-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/c95856dd76e2/sensors-23-00829-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/00add4f257ab/sensors-23-00829-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/3bb530b8ea54/sensors-23-00829-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/0ebcf6a8ddc1/sensors-23-00829-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/51cfab28f170/sensors-23-00829-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/12db81feb0c3/sensors-23-00829-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/35abe6c68d44/sensors-23-00829-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/5a701dd8a230/sensors-23-00829-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/6e170c263625/sensors-23-00829-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/bfc67819340d/sensors-23-00829-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/27a37b0823cb/sensors-23-00829-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/14db935b7c5d/sensors-23-00829-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/1c2ecd8fd211/sensors-23-00829-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/915d2e625cdb/sensors-23-00829-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/2a2b49cc115b/sensors-23-00829-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1a4a/9862587/25df5512f93b/sensors-23-00829-g018.jpg

Similar Articles

1. Vision-Based Autonomous Following of a Moving Platform and Landing for an Unmanned Aerial Vehicle.
   Sensors (Basel). 2023 Jan 11;23(2):829. doi: 10.3390/s23020829.
2. Vision-based safe autonomous UAV docking with panoramic sensors.
   Front Robot AI. 2023 Nov 23;10:1223157. doi: 10.3389/frobt.2023.1223157. eCollection 2023.
3. Autonomous Landing of Quadrotor Unmanned Aerial Vehicles Based on Multi-Level Marker and Linear Active Disturbance Reject Control.
   Sensors (Basel). 2024 Mar 2;24(5):1645. doi: 10.3390/s24051645.
4. Visual Servoing Approach to Autonomous UAV Landing on a Moving Vehicle.
   Sensors (Basel). 2022 Aug 30;22(17):6549. doi: 10.3390/s22176549.
5. IoT Security and Computation Management on a Multi-Robot System for Rescue Operations Based on a Cloud Framework.
   Sensors (Basel). 2022 Jul 26;22(15):5569. doi: 10.3390/s22155569.
6. Analysis on security-related concerns of unmanned aerial vehicle: attacks, limitations, and recommendations.
   Math Biosci Eng. 2022 Jan 10;19(3):2641-2670. doi: 10.3934/mbe.2022121.
7. Proactive Guidance for Accurate UAV Landing on a Dynamic Platform: A Visual-Inertial Approach.
   Sensors (Basel). 2022 Jan 5;22(1):404. doi: 10.3390/s22010404.
8. Cooperative UAV-UGV Autonomous Power Pylon Inspection: An Investigation of Cooperative Outdoor Vehicle Positioning Architecture.
   Sensors (Basel). 2020 Nov 9;20(21):6384. doi: 10.3390/s20216384.
9. Autonomous Unmanned Aerial Vehicles in Search and Rescue Missions Using Real-Time Cooperative Model Predictive Control.
   Sensors (Basel). 2019 Sep 20;19(19):4067. doi: 10.3390/s19194067.
10. UAV sensor failures dataset: Biomisa arducopter sensory critique (BASiC).
    Data Brief. 2024 Jan 15;52:110069. doi: 10.1016/j.dib.2024.110069. eCollection 2024 Feb.

Cited By

1. The IoRT-in-Hand: Tele-Robotic Echography and Digital Twins on Mobile Devices.
   Sensors (Basel). 2025 Aug 11;25(16):4972. doi: 10.3390/s25164972.
2. A Robust Routing Protocol in Cognitive Unmanned Aerial Vehicular Networks.
   Sensors (Basel). 2024 Sep 30;24(19):6334. doi: 10.3390/s24196334.

References

1. Visual Servoed Autonomous Landing of an UAV on a Catamaran in a Marine Environment.
   Sensors (Basel). 2022 May 6;22(9):3544. doi: 10.3390/s22093544.
2. Autonomous Quadcopter Landing on a Moving Target.
   Sensors (Basel). 2022 Feb 1;22(3):1116. doi: 10.3390/s22031116.
3. Proactive Guidance for Accurate UAV Landing on a Dynamic Platform: A Visual-Inertial Approach.
   Sensors (Basel). 2022 Jan 5;22(1):404. doi: 10.3390/s22010404.