Suppr 超能文献



An Appearance-Based Tracking Algorithm for Aerial Search and Rescue Purposes.

Affiliation

Intelligent Systems Lab (LSI), Universidad Carlos III de Madrid, Avnd. de la Universidad 30, 28911 Madrid, Spain.

Publication

Sensors (Basel). 2019 Feb 5;19(3):652. doi: 10.3390/s19030652.

DOI: 10.3390/s19030652
PMID: 30764528
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6387277/
Abstract

The automation of the Wilderness Search and Rescue (WiSAR) task aims for high levels of understanding of various scenery. In addition, working in unfriendly and complex environments may delay the operation and consequently put human lives at risk. To address this problem, Unmanned Aerial Vehicles (UAVs), which provide potential support to the conventional methods, are used. These vehicles are equipped with reliable human detection and tracking algorithms, in order to find and track the victims' bodies in complex environments, and with a robust control system to maintain safe distances from the detected bodies. In this paper, a human detection method based on the color and depth data captured from onboard sensors is proposed. Moreover, computing data association from the skeleton pose and a visual appearance measurement allows the tracking of multiple people with invariance to the scale, translation and rotation of the point of view with respect to the target objects. The system has been validated with real and simulated experiments, and the obtained results show the ability to track multiple individuals even after long-term disappearances. Furthermore, the simulations demonstrate the robustness of the implemented reactive control system as a promising tool for assisting the pilot in performing approach maneuvers in a safe and smooth manner.
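The abstract describes data association built from a skeleton pose plus a visual appearance measurement. The paper's actual descriptors are not reproduced here; the sketch below illustrates only the appearance half of such a scheme, using per-channel color histograms (which are insensitive to the patch's translation, rotation, and, once normalized, largely to scale) compared with the Bhattacharyya coefficient, and a simple greedy matcher standing in for the full association step. All function names and parameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def color_histogram(patch, bins=8):
    """Normalized per-channel color histogram as an appearance descriptor
    for a detected-person image patch (H x W x 3, uint8)."""
    hist = np.concatenate([
        np.histogram(patch[..., c], bins=bins, range=(0, 256))[0]
        for c in range(patch.shape[-1])
    ]).astype(float)
    return hist / hist.sum()

def bhattacharyya(h1, h2):
    """Bhattacharyya coefficient: similarity in [0, 1] between
    two normalized histograms (1.0 = identical)."""
    return float(np.sum(np.sqrt(h1 * h2)))

def associate(track_hists, det_hists, min_sim=0.5):
    """Greedy appearance-based data association: each detection is matched
    to the most similar unused stored track, if similarity clears a gate.
    Returns {detection_index: track_index}."""
    matches, used = {}, set()
    for d, dh in enumerate(det_hists):
        sims = [(bhattacharyya(th, dh), t)
                for t, th in enumerate(track_hists) if t not in used]
        if not sims:
            continue
        best_sim, best_t = max(sims)
        if best_sim >= min_sim:
            matches[d] = best_t
            used.add(best_t)
    return matches
```

Because the stored histogram survives while a track has no matching detection, this kind of appearance memory is also what allows re-identification after the long-term disappearances mentioned in the abstract.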

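The abstract's reactive control system keeps the UAV at a safe distance from detected bodies during approach maneuvers. The authors' controller is not detailed in this record (their references suggest fuzzy-logic control), so the following is only a minimal proportional-control sketch of the distance-keeping idea, with hypothetical parameter values:

```python
def approach_command(measured_dist, safe_dist=3.0, k_p=0.6, v_max=1.0):
    """Reactive distance keeping (illustrative, not the paper's controller):
    a proportional forward-velocity command that drives the UAV toward,
    but never inside, a safe stand-off distance from the detected person.
    Distances in meters, velocity in m/s."""
    error = measured_dist - safe_dist        # positive: still too far away
    v = k_p * error                          # proportional term
    return max(-v_max, min(v_max, v))        # clamp to vehicle limits
```

When the target is far, the command saturates at `v_max`; at the stand-off distance it is zero; inside it, the sign flips and the vehicle backs away, which is what makes the behavior "reactive" and the approach smooth.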

Figures 1–20 (sensors-19-00652-g001 through g020) are available with the full text at PMC6387277.

Similar Articles

1. An Appearance-Based Tracking Algorithm for Aerial Search and Rescue Purposes.
   Sensors (Basel). 2019 Feb 5;19(3):652. doi: 10.3390/s19030652.
2. Autonomous Vision-Based Aerial Grasping for Rotorcraft Unmanned Aerial Vehicles.
   Sensors (Basel). 2019 Aug 3;19(15):3410. doi: 10.3390/s19153410.
3. A Camera-Based Target Detection and Positioning UAV System for Search and Rescue (SAR) Purposes.
   Sensors (Basel). 2016 Oct 25;16(11):1778. doi: 10.3390/s16111778.
4. Coordinated Target Tracking via a Hybrid Optimization Approach.
   Sensors (Basel). 2017 Feb 27;17(3):472. doi: 10.3390/s17030472.
5. Unmanned Aerial Vehicle Object Tracking by Correlation Filter with Adaptive Appearance Model.
   Sensors (Basel). 2018 Aug 21;18(9):2751. doi: 10.3390/s18092751.
6. Onboard Robust Visual Tracking for UAVs Using a Reliable Global-Local Object Model.
   Sensors (Basel). 2016 Aug 31;16(9):1406. doi: 10.3390/s16091406.
7. Dynamic Object Tracking on Autonomous UAV System for Surveillance Applications.
   Sensors (Basel). 2021 Nov 27;21(23):7888. doi: 10.3390/s21237888.
8. Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor.
   Sensors (Basel). 2017 Aug 30;17(9):1987. doi: 10.3390/s17091987.
9. Autonomous Unmanned Aerial Vehicles in Search and Rescue Missions Using Real-Time Cooperative Model Predictive Control.
   Sensors (Basel). 2019 Sep 20;19(19):4067. doi: 10.3390/s19194067.
10. Formation Flight of Multiple UAVs via Onboard Sensor Information Sharing.
   Sensors (Basel). 2015 Jul 17;15(7):17397-419. doi: 10.3390/s150717397.

Cited By

1. Unmanned aerial vehicle based intelligent triage system in mass-casualty incidents using 5G and artificial intelligence.
   World J Emerg Med. 2023;14(4):273-279. doi: 10.5847/wjem.j.1920-8642.2023.066.
2. Dynamic Object Tracking on Autonomous UAV System for Surveillance Applications.
   Sensors (Basel). 2021 Nov 27;21(23):7888. doi: 10.3390/s21237888.
3. Fuzzy Logic for Intelligent Control System Using Soft Computing Applications.

References

1. Fuzzy Logic Based Control for Autonomous Mobile Robot Navigation.
   Comput Intell Neurosci. 2016;2016:9548482. doi: 10.1155/2016/9548482. Epub 2016 Sep 5.
2. MCMC-based particle filtering for tracking a variable number of interacting targets.
   IEEE Trans Pattern Anal Mach Intell. 2005 Nov;27(11):1805-19. doi: 10.1109/TPAMI.2005.223.
3. Sensors (Basel). 2021 Apr 8;21(8):2617. doi: 10.3390/s21082617.
4. Real-Time Human Detection and Gesture Recognition for On-Board UAV Rescue.
   Sensors (Basel). 2021 Mar 20;21(6):2180. doi: 10.3390/s21062180.
5. Unsupervised Human Detection with an Embedded Vision System on a Fully Autonomous UAV for Search and Rescue Operations.
   Sensors (Basel). 2019 Aug 14;19(16):3542. doi: 10.3390/s19163542.