

LightDenseYOLO: A Fast and Accurate Marker Tracker for Autonomous UAV Landing by Visible Light Camera Sensor on Drone.

Affiliation

Division of Electronics and Electrical Engineering, Dongguk University, 30 Pildong-ro 1-gil, Jung-gu, Seoul 100-715, Korea.

Publication

Sensors (Basel). 2018 May 24;18(6):1703. doi: 10.3390/s18061703.

DOI: 10.3390/s18061703
PMID: 29795038
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6022018/
Abstract

Autonomous landing of an unmanned aerial vehicle or a drone is a challenging problem for the robotics research community. Previous researchers have attempted to solve this problem by combining multiple sensors such as global positioning system (GPS) receivers, inertial measurement unit, and multiple camera systems. Although these approaches successfully estimate an unmanned aerial vehicle location during landing, many calibration processes are required to achieve good detection accuracy. In addition, cases where drones operate in heterogeneous areas with no GPS signal should be considered. To overcome these problems, we determined how to safely land a drone in a GPS-denied environment using our remote-marker-based tracking algorithm based on a single visible-light-camera sensor. Instead of using hand-crafted features, our algorithm includes a convolutional neural network named lightDenseYOLO to extract trained features from an input image to predict a marker's location by visible light camera sensor on drone. Experimental results show that our method significantly outperforms state-of-the-art object trackers both using and not using convolutional neural network in terms of both accuracy and processing time.

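The abstract describes a pipeline in which a CNN (lightDenseYOLO) localizes a landing marker in each camera frame, and that location then guides the descent. As an illustrative sketch only (not the authors' code), the snippet below shows the kind of post-detection step such a pipeline needs: converting a predicted marker bounding box into normalized image-plane offsets a landing controller could act on. The function name, box format, and image size are all assumptions for the example.

```python
# Illustrative sketch (not the paper's implementation): turn a detector's
# predicted marker bounding box into image-plane guidance offsets, the step
# that follows once a lightDenseYOLO-style network localizes the marker.

def marker_offset(box, image_size):
    """Return the marker-center offset from the image center, normalized to
    [-1, 1] per axis. box = (x, y, w, h) in pixels; image_size = (W, H)."""
    x, y, w, h = box
    W, H = image_size
    cx = x + w / 2.0  # marker center, x (pixels)
    cy = y + h / 2.0  # marker center, y (pixels)
    return ((cx - W / 2.0) / (W / 2.0),   # +right / -left of image center
            (cy - H / 2.0) / (H / 2.0))   # +below / -above image center

# A drone hovering centered over the marker sees offsets near (0, 0):
print(marker_offset((300, 220, 40, 40), (640, 480)))  # -> (0.0, 0.0)
```

A controller would feed these offsets into lateral velocity commands until they shrink toward zero, then descend; the paper's contribution is the detector that makes the box prediction fast and accurate, not this control step.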

Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/4d9dc453b991/sensors-18-01703-g021a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/276bdaf07e49/sensors-18-01703-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/25b87b622193/sensors-18-01703-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/abc7d3a6ad99/sensors-18-01703-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/9e8549ae5844/sensors-18-01703-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/c46abb379898/sensors-18-01703-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/4172faa300b3/sensors-18-01703-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/2ae85cebe3e8/sensors-18-01703-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/f5ee8297bae8/sensors-18-01703-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/4c35de4377ba/sensors-18-01703-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/6fdfb636f498/sensors-18-01703-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/a8619729186c/sensors-18-01703-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/ec9256c7bf4d/sensors-18-01703-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/77b6fe39d927/sensors-18-01703-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/7e9c0545926d/sensors-18-01703-g014a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/30ca556aa5ca/sensors-18-01703-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/dc0f104c2422/sensors-18-01703-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/5fd48bf2b60b/sensors-18-01703-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/ddd75e8490be/sensors-18-01703-g018.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/ef945d6ba34b/sensors-18-01703-g019.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/50c9/6022018/aa2e22687238/sensors-18-01703-g020.jpg

Similar articles

1. LightDenseYOLO: A Fast and Accurate Marker Tracker for Autonomous UAV Landing by Visible Light Camera Sensor on Drone. Sensors (Basel). 2018 May 24;18(6):1703. doi: 10.3390/s18061703.
2. Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor. Sensors (Basel). 2017 Aug 30;17(9):1987. doi: 10.3390/s17091987.
3. Precision Landing of a Quadcopter Drone by Smartphone Video Guidance Sensor in a GPS-Denied Environment. Sensors (Basel). 2023 Feb 9;23(4):1934. doi: 10.3390/s23041934.
4. A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment. Sensors (Basel). 2016 Aug 30;16(9):1393. doi: 10.3390/s16091393.
5. SlimDeblurGAN-Based Motion Deblurring and Marker Detection for Autonomous Drone Landing. Sensors (Basel). 2020 Jul 14;20(14):3918. doi: 10.3390/s20143918.
6. UAV Autonomous Tracking and Landing Based on Deep Reinforcement Learning Strategy. Sensors (Basel). 2020 Oct 1;20(19):5630. doi: 10.3390/s20195630.
7. Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm. Sensors (Basel). 2016 Nov 3;16(11):1844. doi: 10.3390/s16111844.
8. Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments. Sensors (Basel). 2021 Sep 16;21(18):6226. doi: 10.3390/s21186226.
9. Real-Time Onboard 3D State Estimation of an Unmanned Aerial Vehicle in Multi-Environments Using Multi-Sensor Data Fusion. Sensors (Basel). 2020 Feb 9;20(3):919. doi: 10.3390/s20030919.
10. VIAE-Net: An End-to-End Altitude Estimation through Monocular Vision and Inertial Feature Fusion Neural Networks for UAV Autonomous Landing. Sensors (Basel). 2021 Sep 20;21(18):6302. doi: 10.3390/s21186302.

Cited by

1. Three-Dimensional Landing Zone Segmentation in Urbanized Aerial Images from Depth Information Using a Deep Neural Network-Superpixel Approach. Sensors (Basel). 2025 Apr 17;25(8):2517. doi: 10.3390/s25082517.
2. Simulation and real-life implementation of UAV autonomous landing system based on object recognition and tracking for safe landing in uncertain environments. Front Robot AI. 2024 Oct 18;11:1450266. doi: 10.3389/frobt.2024.1450266. eCollection 2024.
3. Comprehensive Investigation of Unmanned Aerial Vehicles (UAVs): An In-Depth Analysis of Avionics Systems. Sensors (Basel). 2024 May 11;24(10):3064. doi: 10.3390/s24103064.
4. UWB and IMU-Based UAV's Assistance System for Autonomous Landing on a Platform. Sensors (Basel). 2022 Mar 18;22(6):2347. doi: 10.3390/s22062347.
5. Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments. Sensors (Basel). 2021 Sep 16;21(18):6226. doi: 10.3390/s21186226.
6. Exploring Fast Fingerprint Construction Algorithm for Unmodulated Visible Light Indoor Localization. Sensors (Basel). 2020 Dec 17;20(24):7245. doi: 10.3390/s20247245.
7. SlimDeblurGAN-Based Motion Deblurring and Marker Detection for Autonomous Drone Landing. Sensors (Basel). 2020 Jul 14;20(14):3918. doi: 10.3390/s20143918.
8. UAV Landing Using Computer Vision Techniques for Human Detection. Sensors (Basel). 2020 Jan 22;20(3):613. doi: 10.3390/s20030613.
9. Artificial Marker and MEMS IMU-Based Pose Estimation Method to Meet Multirotor UAV Landing Requirements. Sensors (Basel). 2019 Dec 9;19(24):5428. doi: 10.3390/s19245428.
10. Motion Estimation by Hybrid Optical Flow Technology for UAV Landing in an Unvisited Area. Sensors (Basel). 2019 Mar 20;19(6):1380. doi: 10.3390/s19061380.

References

1. Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor. Sensors (Basel). 2017 Aug 30;17(9):1987. doi: 10.3390/s17091987.
2. A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment. Sensors (Basel). 2016 Aug 30;16(9):1393. doi: 10.3390/s16091393.
3. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. IEEE Trans Pattern Anal Mach Intell. 2017 Jun;39(6):1137-1149. doi: 10.1109/TPAMI.2016.2577031. Epub 2016 Jun 6.
4. What Makes for Effective Detection Proposals? IEEE Trans Pattern Anal Mach Intell. 2016 Apr;38(4):814-30. doi: 10.1109/TPAMI.2015.2465908.
5. High-Speed Tracking with Kernelized Correlation Filters. IEEE Trans Pattern Anal Mach Intell. 2015 Mar;37(3):583-96. doi: 10.1109/TPAMI.2014.2345390.
6. Deep learning in neural networks: an overview. Neural Netw. 2015 Jan;61:85-117. doi: 10.1016/j.neunet.2014.09.003. Epub 2014 Oct 13.
7. Tracking-Learning-Detection. IEEE Trans Pattern Anal Mach Intell. 2012 Jul;34(7):1409-22. doi: 10.1109/TPAMI.2011.239. Epub 2011 Dec 13.
8. Active contours without edges. IEEE Trans Image Process. 2001;10(2):266-77. doi: 10.1109/83.902291.