


Real-Time Human Motion Tracking by Tello EDU Drone.

Affiliation

Department of Electrical Engineering, Faculty of Engineering, Burapha University Chonburi Campus, Chonburi 20131, Thailand.

Publication

Sensors (Basel). 2023 Jan 12;23(2):897. doi: 10.3390/s23020897.

DOI: 10.3390/s23020897
PMID: 36679699
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9860987/
Abstract

Human movement tracking is useful in a variety of areas, such as search-and-rescue activities. CCTV and IP cameras are popular as front-end sensors for tracking human motion; however, they are stationary and have limited applicability in hard-to-reach places, such as those where disasters have occurred. Using a drone to discover a person is challenging and requires an innovative approach. In this paper, we present the design and implementation of a human motion tracking method using a Tello EDU drone. The design methodology is carried out in four steps: (1) control panel design; (2) human motion tracking algorithm; (3) notification system; and (4) communication and distance extension. Intensive experimental results show that a drone implementing the proposed algorithm performs well in tracking a human at a distance of 2-10 m moving at a speed of 2 m/s. In an experimental field measuring 95 × 35 m², the drone tracked human motion throughout a whole day, with the best tracking results observed in the morning. The drone was controlled from a laptop through a Wi-Fi router, with a maximum horizontal tracking distance of 84.30 m and a maximum vertical distance of 13.40 m. The experiments showed an accuracy rate for human movement detection between 96.67% and 100%.
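The tracking algorithm described in the abstract must, at each video frame, turn the detected person's position and apparent size into flight commands. The sketch below shows one common way to write such proportional steering logic, assuming an upstream detector (for example, OpenCV's HOG person detector) supplies a pixel bounding box and that commands go to the Tello's RC control channel, which accepts velocities in the -100..100 range. The function name, gains, and target-size fraction are illustrative assumptions, not the authors' implementation.

```python
def track_command(bbox, frame_w, frame_h,
                  target_area_frac=0.08, k_yaw=0.3, k_fb=200):
    """Map a person's bounding box to (yaw, forward/back) velocity commands
    in the -100..100 range used by the Tello SDK's RC control.

    bbox is (x, y, w, h) in pixels. Gains are illustrative, not from the paper.
    """
    x, y, w, h = bbox
    cx = x + w / 2
    # Horizontal error: how far the person sits from the frame centre (-1..1).
    err_x = (cx - frame_w / 2) / (frame_w / 2)
    yaw = max(-100, min(100, int(k_yaw * 100 * err_x)))
    # Size error: a small bounding box means the person is far away,
    # so command forward flight; a large box commands backward flight.
    area_frac = (w * h) / (frame_w * frame_h)
    err_area = target_area_frac - area_frac
    fb = max(-100, min(100, int(k_fb * err_area)))
    return yaw, fb
```

In a full loop, a hypothetical caller would detect the person in each frame and forward the result, e.g. `tello.send_rc_control(0, fb, 0, yaw)` with the `djitellopy` library. A centred person at roughly the target apparent size yields near-zero commands, so the drone hovers; an off-centre or distant person produces a turn or forward motion proportional to the error.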


Article figures: sensors-23-00897-g001.jpg through g029.jpg and g031.jpg, hosted under https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a3f0/9860987/

Similar articles

1. Real-Time Human Motion Tracking by Tello EDU Drone.
Sensors (Basel). 2023 Jan 12;23(2):897. doi: 10.3390/s23020897.
2. Fusion Filters between the No Motion No Integration Technique and Kalman Filter in Noise Optimization on a 6DoF Drone for Orientation Tracking.
Sensors (Basel). 2023 Jun 15;23(12):5603. doi: 10.3390/s23125603.
3. Modifying Hata-Davidson Propagation Model for Remote Sensing in Complex Environments Using a Multifactional Drone.
Sensors (Basel). 2022 Feb 24;22(5):1786. doi: 10.3390/s22051786.
4. Applications of drone in disaster management: A scoping review.
Sci Justice. 2022 Jan;62(1):30-42. doi: 10.1016/j.scijus.2021.11.002. Epub 2021 Nov 14.
5. Sensing spectrum sharing based massive MIMO radar for drone tracking and interception.
PLoS One. 2022 May 20;17(5):e0268834. doi: 10.1371/journal.pone.0268834. eCollection 2022.
6. Visual-inertial hand motion tracking with robustness against occlusion, interference, and contact.
Sci Robot. 2021 Sep 29;6(58):eabe1315. doi: 10.1126/scirobotics.abe1315.
7. Detection and Tracking Meet Drones Challenge.
IEEE Trans Pattern Anal Mach Intell. 2022 Nov;44(11):7380-7399. doi: 10.1109/TPAMI.2021.3119563. Epub 2022 Oct 4.
8. Countering a Drone in a 3D Space: Analyzing Deep Reinforcement Learning Methods.
Sensors (Basel). 2022 Nov 16;22(22):8863. doi: 10.3390/s22228863.
9. An Investigation of the Reliability of Different Types of Sensors in the Real-Time Vibration-Based Anomaly Inspection in Drone.
Sensors (Basel). 2022 Aug 12;22(16):6015. doi: 10.3390/s22166015.
10. High-Resolution Drone Detection Based on Background Difference and SAG-YOLOv5s.
Sensors (Basel). 2022 Aug 4;22(15):5825. doi: 10.3390/s22155825.
