

UAV Landing Using Computer Vision Techniques for Human Detection.

Affiliations

School of Technology and Management, Computer Science and Communication Research Centre, Polytechnic Institute of Leiria, Campus 2, Morro do Lena - Alto do Vieiro, Apartado 4163, 2411-901 Leiria, Portugal.

INESC TEC and University of Trás-os-Montes e Alto Douro, Quinta de Prados, 5001-801 Vila Real, Portugal.

Publication Information

Sensors (Basel). 2020 Jan 22;20(3):613. doi: 10.3390/s20030613.

PMID: 31979142
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7037756/
Abstract

The capability of drones to perform autonomous missions has led retail companies to use them for deliveries, saving time and human resources. In these services, the delivery depends on the Global Positioning System (GPS) to define an approximate landing point. However, the landscape can interfere with the satellite signal (e.g., tall buildings), reducing the accuracy of this approach. Changes in the environment can also invalidate the security of a previously defined landing site (e.g., irregular terrain, swimming pool). Therefore, the main goal of this work is to improve the process of goods delivery using drones, focusing on the detection of the potential receiver. We developed a solution that has been improved along its iterative assessment composed of five test scenarios. The built prototype complements the GPS through Computer Vision (CV) algorithms, based on Convolutional Neural Networks (CNN), running in a Raspberry Pi 3 with a Pi NoIR Camera (i.e., No InfraRed-without infrared filter). The experiments were performed with the models Single Shot Detector (SSD) MobileNet-V2, and SSDLite-MobileNet-V2. The best results were obtained in the afternoon, with the SSDLite architecture, for distances and heights between 2.5-10 m, with recalls from 59%-76%. The results confirm that a low computing power and cost-effective system can perform aerial human detection, estimating the landing position without an additional visual marker.
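The abstract states that the system estimates the landing position from the aerial human detection alone, without an additional visual marker. The paper's exact projection method is not given in the abstract, but the idea can be sketched as mapping the detection's bounding-box centre to a ground offset below a downward-facing camera. This is a minimal illustration under assumed conditions: a pinhole camera pointing straight down, a small-angle approximation, and hypothetical field-of-view defaults (the values below are typical of a Raspberry Pi camera module, not taken from the paper).

```python
import math

def ground_offset(bbox, img_w, img_h, altitude_m, hfov_deg=62.2, vfov_deg=48.8):
    """Project a detection's bounding-box centre onto the ground plane.

    bbox: (xmin, ymin, xmax, ymax) in pixels.
    Returns (dx, dy) in metres, relative to the point directly below a
    downward-facing camera. Uses a small-angle pinhole approximation;
    the FOV defaults are assumptions, not values from the paper.
    """
    cx = (bbox[0] + bbox[2]) / 2.0
    cy = (bbox[1] + bbox[3]) / 2.0
    # Normalised offset of the detection centre from the image centre,
    # in the range [-0.5, 0.5] on each axis.
    nx = cx / img_w - 0.5
    ny = cy / img_h - 0.5
    # Approximate viewing angle per axis, then project onto the ground
    # plane at the drone's current altitude.
    ax = nx * math.radians(hfov_deg)
    ay = ny * math.radians(vfov_deg)
    return altitude_m * math.tan(ax), altitude_m * math.tan(ay)

# A person detected at the image centre sits directly below the drone.
print(ground_offset((300, 220, 340, 260), 640, 480, altitude_m=10.0))  # -> (0.0, 0.0)
```

At the 2.5-10 m heights reported in the abstract, an offset like this could feed a simple horizontal correction loop before descent; a real implementation would also need camera calibration and attitude compensation.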


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/61657aa95d7f/sensors-20-00613-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/e1bd913a97d8/sensors-20-00613-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/e917432c7221/sensors-20-00613-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/66a03f5ce173/sensors-20-00613-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/c4495ca29838/sensors-20-00613-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/a927095d6861/sensors-20-00613-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/e7a6d842b52c/sensors-20-00613-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/63282cc7ca73/sensors-20-00613-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/c0397b2ea026/sensors-20-00613-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/9d4bcc3dca02/sensors-20-00613-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/effcd4f9f1ca/sensors-20-00613-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/4044cb5cd6ff/sensors-20-00613-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/7974aade074e/sensors-20-00613-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/7abcd7a4c27c/sensors-20-00613-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/eac9b5f9f6b4/sensors-20-00613-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/44115088e87a/sensors-20-00613-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/75f239b28252/sensors-20-00613-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/64b7e16fdbf9/sensors-20-00613-g018.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/09c2ae02f478/sensors-20-00613-g019.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6688/7037756/fd614f2d0bcd/sensors-20-00613-g020.jpg

Similar Articles

1. UAV Landing Using Computer Vision Techniques for Human Detection.
   Sensors (Basel). 2020 Jan 22;20(3):613. doi: 10.3390/s20030613.
2. Using Deep Learning and Low-Cost RGB and Thermal Cameras to Detect Pedestrians in Aerial Images Captured by Multirotor UAV.
   Sensors (Basel). 2018 Jul 12;18(7):2244. doi: 10.3390/s18072244.
3. Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor.
   Sensors (Basel). 2017 Aug 30;17(9):1987. doi: 10.3390/s17091987.
4. Unmanned aerial vehicles for surveying marine fauna: assessing detection probability.
   Ecol Appl. 2017 Jun;27(4):1253-1267. doi: 10.1002/eap.1519. Epub 2017 Apr 17.
5. LightDenseYOLO: A Fast and Accurate Marker Tracker for Autonomous UAV Landing by Visible Light Camera Sensor on Drone.
   Sensors (Basel). 2018 May 24;18(6):1703. doi: 10.3390/s18061703.
6. Application of Deep-Learning Methods to Bird Detection Using Unmanned Aerial Vehicle Imagery.
   Sensors (Basel). 2019 Apr 6;19(7):1651. doi: 10.3390/s19071651.
7. DeepBrain: Experimental Evaluation of Cloud-Based Computation Offloading and Edge Computing in the Internet-of-Drones for Deep Learning Applications.
   Sensors (Basel). 2020 Sep 14;20(18):5240. doi: 10.3390/s20185240.
8. Precision Landing Test and Simulation of the Agricultural UAV on Apron.
   Sensors (Basel). 2020 Jun 14;20(12):3369. doi: 10.3390/s20123369.
9. A Ground-Based Near Infrared Camera Array System for UAV Auto-Landing in GPS-Denied Environment.
   Sensors (Basel). 2016 Aug 30;16(9):1393. doi: 10.3390/s16091393.
10. Convolutional Neural Networks enable efficient, accurate and fine-grained segmentation of plant species and communities from high-resolution UAV imagery.
   Sci Rep. 2019 Nov 27;9(1):17656. doi: 10.1038/s41598-019-53797-9.

Cited By

1. Vision-based safe autonomous UAV docking with panoramic sensors.
   Front Robot AI. 2023 Nov 23;10:1223157. doi: 10.3389/frobt.2023.1223157. eCollection 2023.
2. Distributed Architecture for Unmanned Vehicle Services.
   Sensors (Basel). 2021 Feb 20;21(4):1477. doi: 10.3390/s21041477.

References

1. A Study on the Detection of Cattle in UAV Images Using Deep Learning.
   Sensors (Basel). 2019 Dec 10;19(24):5436. doi: 10.3390/s19245436.
2. LightDenseYOLO: A Fast and Accurate Marker Tracker for Autonomous UAV Landing by Visible Light Camera Sensor on Drone.
   Sensors (Basel). 2018 May 24;18(6):1703. doi: 10.3390/s18061703.
3. Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor.
   Sensors (Basel). 2017 Aug 30;17(9):1987. doi: 10.3390/s17091987.
4. Using Deep Learning for Image-Based Plant Disease Detection.
   Front Plant Sci. 2016 Sep 22;7:1419. doi: 10.3389/fpls.2016.01419. eCollection 2016.