Suppr 超能文献


Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments.

Authors

Lin Shanggang, Jin Lianwen, Chen Ziwei

Affiliations

School of Electronic and Information Engineering, South China University of Technology, Guangzhou 510640, China.

South China University of Technology-Zhuhai Institute of Modern Industrial Innovation, Zhuhai 519175, China.

Publication Information

Sensors (Basel). 2021 Sep 16;21(18):6226. doi: 10.3390/s21186226.

DOI: 10.3390/s21186226
PMID: 34577433
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8471562/
Abstract

Landing an unmanned aerial vehicle (UAV) autonomously and safely is a challenging task. Although the existing approaches have resolved the problem of precise landing by identifying a specific landing marker using the UAV's onboard vision system, the vast majority of these works are conducted in either daytime or well-illuminated laboratory environments. In contrast, very few researchers have investigated the possibility of landing in low-illumination conditions by employing various active light sources to lighten the markers. In this paper, a novel vision system design is proposed to tackle UAV landing in outdoor extreme low-illumination environments without the need to apply an active light source to the marker. We use a model-based enhancement scheme to improve the quality and brightness of the onboard captured images, then present a hierarchical-based method consisting of a decision tree with an associated light-weight convolutional neural network (CNN) for coarse-to-fine landing marker localization, where the key information of the marker is extracted and reserved for post-processing, such as pose estimation and landing control. Extensive evaluations have been conducted to demonstrate the robustness, accuracy, and real-time performance of the proposed vision system. Field experiments across a variety of outdoor nighttime scenarios with an average luminance of 5 lx at the marker locations have proven the feasibility and practicability of the system.
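The abstract describes the enhancement step only at a high level ("a model-based enhancement scheme to improve the quality and brightness of the onboard captured images"). As a loose, hypothetical illustration of this kind of brightness lifting — a simple gamma correction, which is a stand-in and not the authors' actual model — the idea can be sketched as:

```python
def enhance_low_light(pixels, gamma=0.5):
    """Brighten a grayscale image given as intensity values in [0, 255].

    A gamma exponent below 1 lifts dark regions proportionally more than
    bright ones, which is the basic effect any model-based low-light
    enhancement aims for. This is an illustrative stand-in only; the
    paper's actual enhancement model is not reproduced here.
    """
    return [round(255 * (p / 255) ** gamma) for p in pixels]

# A dark nighttime patch: low values dominate.
dark = [10, 40, 80, 200]
bright = enhance_low_light(dark)
```

After enhancement, every pixel is at least as bright as before, with the darkest pixels gaining the most — which is what makes the downstream marker localization tractable at roughly 5 lx scene luminance.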


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a7da/8471562/3548b65d91d7/sensors-21-06226-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a7da/8471562/64baf86df0f2/sensors-21-06226-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a7da/8471562/624b3a537cff/sensors-21-06226-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a7da/8471562/5c565a6082d3/sensors-21-06226-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a7da/8471562/b3bcd4738424/sensors-21-06226-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a7da/8471562/beb48dbc37b9/sensors-21-06226-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a7da/8471562/7e0b0cf9117b/sensors-21-06226-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a7da/8471562/2ac1cc8c0298/sensors-21-06226-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a7da/8471562/a98d294a3f33/sensors-21-06226-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a7da/8471562/ee0b12df21a8/sensors-21-06226-g010.jpg

Similar Articles

1
Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments.
Sensors (Basel). 2021 Sep 16;21(18):6226. doi: 10.3390/s21186226.
2
Autonomous Landing of Quadrotor Unmanned Aerial Vehicles Based on Multi-Level Marker and Linear Active Disturbance Reject Control.
Sensors (Basel). 2024 Mar 2;24(5):1645. doi: 10.3390/s24051645.
3
Vision-Based Autonomous Following of a Moving Platform and Landing for an Unmanned Aerial Vehicle.
Sensors (Basel). 2023 Jan 11;23(2):829. doi: 10.3390/s23020829.
4
Autonomous Vision-Based Aerial Grasping for Rotorcraft Unmanned Aerial Vehicles.
Sensors (Basel). 2019 Aug 3;19(15):3410. doi: 10.3390/s19153410.
5
An Onboard Vision-Based System for Autonomous Landing of a Low-Cost Quadrotor on a Novel Landing Pad.
Sensors (Basel). 2019 Oct 29;19(21):4703. doi: 10.3390/s19214703.
6
Artificial Marker and MEMS IMU-Based Pose Estimation Method to Meet Multirotor UAV Landing Requirements.
Sensors (Basel). 2019 Dec 9;19(24):5428. doi: 10.3390/s19245428.
7
LightDenseYOLO: A Fast and Accurate Marker Tracker for Autonomous UAV Landing by Visible Light Camera Sensor on Drone.
Sensors (Basel). 2018 May 24;18(6):1703. doi: 10.3390/s18061703.
8
Monocular-Vision-Based Precise Runway Detection Applied to State Estimation for Carrier-Based UAV Landing.
Sensors (Basel). 2022 Nov 1;22(21):8385. doi: 10.3390/s22218385.
9
Visual Servoing Approach to Autonomous UAV Landing on a Moving Vehicle.
Sensors (Basel). 2022 Aug 30;22(17):6549. doi: 10.3390/s22176549.
10
Vision-based safe autonomous UAV docking with panoramic sensors.
Front Robot AI. 2023 Nov 23;10:1223157. doi: 10.3389/frobt.2023.1223157. eCollection 2023.

Cited By

1
Three-Dimensional Landing Zone Segmentation in Urbanized Aerial Images from Depth Information Using a Deep Neural Network-Superpixel Approach.
Sensors (Basel). 2025 Apr 17;25(8):2517. doi: 10.3390/s25082517.
2
Autonomous Landing Strategy for Micro-UAV with Mirrored Field-of-View Expansion.
Sensors (Basel). 2024 Oct 27;24(21):6889. doi: 10.3390/s24216889.
3
Vision-Based UAV Detection and Localization to Indoor Positioning System.
Sensors (Basel). 2024 Jun 25;24(13):4121. doi: 10.3390/s24134121.
4
Monocular-Vision-Based Precise Runway Detection Applied to State Estimation for Carrier-Based UAV Landing.
Sensors (Basel). 2022 Nov 1;22(21):8385. doi: 10.3390/s22218385.
5
Visual Landing Based on the Human Depth Perception in Limited Visibility and Failure of Avionic Systems.
Comput Intell Neurosci. 2022 Apr 22;2022:4320101. doi: 10.1155/2022/4320101. eCollection 2022.
6
UWB and IMU-Based UAV's Assistance System for Autonomous Landing on a Platform.
Sensors (Basel). 2022 Mar 18;22(6):2347. doi: 10.3390/s22062347.

References

1
Optical Navigation Sensor for Runway Relative Positioning of Aircraft during Final Approach.
Sensors (Basel). 2021 Mar 21;21(6):2203. doi: 10.3390/s21062203.
2
Altitude Measurement-Based Optimization of the Landing Process of UAVs.
Sensors (Basel). 2021 Feb 6;21(4):1151. doi: 10.3390/s21041151.
3
Precision Landing Test and Simulation of the Agricultural UAV on Apron.
Sensors (Basel). 2020 Jun 14;20(12):3369. doi: 10.3390/s20123369.
4
Artificial Marker and MEMS IMU-Based Pose Estimation Method to Meet Multirotor UAV Landing Requirements.
Sensors (Basel). 2019 Dec 9;19(24):5428. doi: 10.3390/s19245428.
5
LightDenseYOLO: A Fast and Accurate Marker Tracker for Autonomous UAV Landing by Visible Light Camera Sensor on Drone.
Sensors (Basel). 2018 May 24;18(6):1703. doi: 10.3390/s18061703.
6
Localization Framework for Real-Time UAV Autonomous Landing: An On-Ground Deployed Visual Approach.
Sensors (Basel). 2017 Jun 19;17(6):1437. doi: 10.3390/s17061437.
7
A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment.
Sensors (Basel). 2016 Aug 30;16(9):1393. doi: 10.3390/s16091393.
8
Single Image Haze Removal Using Dark Channel Prior.
IEEE Trans Pattern Anal Mach Intell. 2011 Dec;33(12):2341-53. doi: 10.1109/TPAMI.2010.168. Epub 2010 Sep 9.