
Onboard Robust Visual Tracking for UAVs Using a Reliable Global-Local Object Model

Authors

Fu Changhong, Duan Ran, Kircali Dogan, Kayacan Erdal

Affiliations

School of Mechanical and Aerospace Engineering, Nanyang Technological University (NTU), 50 Nanyang Avenue, Singapore 639798, Singapore.

ST Engineering-NTU Corporate Laboratory, Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798, Singapore.

Publication

Sensors (Basel). 2016 Aug 31;16(9):1406. doi: 10.3390/s16091406.

DOI: 10.3390/s16091406
PMID: 27589769
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5038684/
Abstract

In this paper, we present a novel onboard robust visual algorithm for long-term arbitrary 2D and 3D object tracking using a reliable global-local object model for unmanned aerial vehicle (UAV) applications, e.g., autonomous tracking and chasing a moving target. The first main approach in this novel algorithm is the use of a global matching and local tracking approach. In other words, the algorithm initially finds feature correspondences in a way that an improved binary descriptor is developed for global feature matching and an iterative Lucas-Kanade optical flow algorithm is employed for local feature tracking. The second main module is the use of an efficient local geometric filter (LGF), which handles outlier feature correspondences based on a new forward-backward pairwise dissimilarity measure, thereby maintaining pairwise geometric consistency. In the proposed LGF module, a hierarchical agglomerative clustering, i.e., bottom-up aggregation, is applied using an effective single-link method. The third proposed module is a heuristic local outlier factor (to the best of our knowledge, it is utilized for the first time to deal with outlier features in a visual tracking application), which further maximizes the representation of the target object in which we formulate outlier feature detection as a binary classification problem with the output features of the LGF module. Extensive UAV flight experiments show that the proposed visual tracker achieves real-time frame rates of more than thirty-five frames per second on an i7 processor with 640 × 512 image resolution and outperforms the most popular state-of-the-art trackers favorably in terms of robustness, efficiency and accuracy.
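The local geometric filter (LGF) described above groups feature correspondences by pairwise geometric consistency: under coherent target motion, the distance between two correctly tracked features changes little from frame to frame, so large changes flag outliers. The sketch below is an illustrative, simplified version of that idea — a frame-to-frame pairwise dissimilarity (the paper's measure also incorporates backward tracking) fed into a bottom-up single-link clustering, keeping the largest cluster as inliers. All function names and the cut-off value are assumptions, not the paper's implementation.

```python
import math

def pairwise_dissimilarity(prev_pts, curr_pts, i, j):
    """Change in distance between two tracked points across frames.

    Geometrically consistent motion keeps pairwise distances roughly
    stable, so a large change flags at least one outlier correspondence.
    (Simplified stand-in for the paper's forward-backward measure.)
    """
    d_prev = math.dist(prev_pts[i], prev_pts[j])
    d_curr = math.dist(curr_pts[i], curr_pts[j])
    return abs(d_curr - d_prev)

def single_link_inliers(prev_pts, curr_pts, cut=3.0):
    """Hierarchical agglomerative clustering with the single-link method.

    Clusters are merged bottom-up whenever the *closest* pair of members
    between two clusters has dissimilarity below `cut`; the largest
    cluster is returned as the inlier index set.
    """
    n = len(prev_pts)
    clusters = [{i} for i in range(n)]
    merged = True
    while merged and len(clusters) > 1:
        merged = False
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # single link: minimum dissimilarity over member pairs
                link = min(
                    pairwise_dissimilarity(prev_pts, curr_pts, i, j)
                    for i in clusters[a] for j in clusters[b]
                )
                if link < cut:
                    clusters[a] |= clusters[b]
                    del clusters[b]
                    merged = True
                    break
            if merged:
                break
    return max(clusters, key=len)
```

For example, if four features translate uniformly while a fifth drifts off on its own, the four preserve all pairwise distances (dissimilarity 0) and merge into one cluster, while the drifting feature stays isolated and is rejected.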

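The third module frames outlier rejection as binary classification via a local outlier factor (LOF). A minimal, textbook-style LOF (k-distance, reachability distance, local reachability density) is sketched below from first principles — this is not the paper's heuristic variant, and the `k` and threshold values are illustrative assumptions only. Features in a dense, mutually consistent group score near 1; a feature in a sparser region scores well above 1 and is classified as an outlier.

```python
import math

def lof_scores(points, k=2):
    """Plain local outlier factor for a list of 2D points.

    A score near 1 means the point's local density matches its
    neighbours' (inlier); a score clearly above 1 means it sits in a
    sparser region than its neighbours (outlier).
    """
    n = len(points)
    dist = [[math.dist(p, q) for q in points] for p in points]
    # k nearest neighbours of each point (index 0 is the point itself)
    knn = [sorted(range(n), key=lambda j: dist[i][j])[1:k + 1]
           for i in range(n)]
    k_dist = [dist[i][knn[i][-1]] for i in range(n)]

    def lrd(i):
        # local reachability density: inverse mean reachability distance
        reach = [max(k_dist[j], dist[i][j]) for j in knn[i]]
        return k / sum(reach)

    dens = [lrd(i) for i in range(n)]
    return [sum(dens[j] for j in knn[i]) / (k * dens[i]) for i in range(n)]

def inlier_mask(points, k=2, threshold=1.5):
    """Binary classification of features: True = inlier, False = outlier."""
    return [s < threshold for s in lof_scores(points, k)]
```

Applied to the LGF module's surviving feature locations, a mask like this would further prune isolated stragglers, tightening the target representation.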

[Figures sensors-16-01406-g001 through g024 are available in the PMC full text.]

Similar Articles

1. Onboard Robust Visual Tracking for UAVs Using a Reliable Global-Local Object Model.
   Sensors (Basel). 2016 Aug 31;16(9):1406. doi: 10.3390/s16091406.
2. Learning channel-selective and aberrance repressed correlation filter with memory model for unmanned aerial vehicle object tracking.
   Front Neurosci. 2023 Jan 10;16:1080521. doi: 10.3389/fnins.2022.1080521. eCollection 2022.
3. Unmanned Aerial Vehicle Object Tracking by Correlation Filter with Adaptive Appearance Model.
   Sensors (Basel). 2018 Aug 21;18(9):2751. doi: 10.3390/s18092751.
4. Robust UAV-based Tracking Using Hybrid Classifiers.
   Mach Vis Appl. 2019 Feb;30(1):125-137. doi: 10.1007/s00138-018-0981-4. Epub 2018 Sep 29.
5. Dynamic Object Tracking on Autonomous UAV System for Surveillance Applications.
   Sensors (Basel). 2021 Nov 27;21(23):7888. doi: 10.3390/s21237888.
6. Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor.
   Sensors (Basel). 2017 Aug 30;17(9):1987. doi: 10.3390/s17091987.
7. Robust Visual Tracking with Reliable Object Information and Kalman Filter.
   Sensors (Basel). 2021 Jan 28;21(3):889. doi: 10.3390/s21030889.
8. Autonomous Vision-Based Aerial Grasping for Rotorcraft Unmanned Aerial Vehicles.
   Sensors (Basel). 2019 Aug 3;19(15):3410. doi: 10.3390/s19153410.
9. Object Tracking Using Local Multiple Features and a Posterior Probability Measure.
   Sensors (Basel). 2017 Mar 31;17(4):739. doi: 10.3390/s17040739.
10. Learning Response-Consistent and Background-Suppressed Correlation Filters for Real-Time UAV Tracking.
   Sensors (Basel). 2023 Mar 9;23(6):2980. doi: 10.3390/s23062980.

Cited By

1. Moving Object Tracking Based on Sparse Optical Flow with Moving Window and Target Estimator.
   Sensors (Basel). 2022 Apr 8;22(8):2878. doi: 10.3390/s22082878.
2. In-Flight Tests of Intruder Detection Vision System.
   Sensors (Basel). 2021 Nov 5;21(21):7360. doi: 10.3390/s21217360.
3. Real-Time Object Tracking via Adaptive Correlation Filters.

References

1. Towards an Autonomous Vision-Based Unmanned Aerial System against Wildlife Poachers.
   Sensors (Basel). 2015 Dec 12;15(12):31362-91. doi: 10.3390/s151229861.
2. High-Speed Tracking with Kernelized Correlation Filters.
   IEEE Trans Pattern Anal Mach Intell. 2015 Mar;37(3):583-96. doi: 10.1109/TPAMI.2014.2345390.
3. Tracking-Learning-Detection.
   IEEE Trans Pattern Anal Mach Intell. 2012 Jul;34(7):1409-22. doi: 10.1109/TPAMI.2011.239. Epub 2011 Dec 13.
4. Vision-Based Multirotor Following Using Synthetic Learning Techniques.
   Sensors (Basel). 2019 Nov 4;19(21):4794. doi: 10.3390/s19214794.
5. UAV Sensor Fault Detection Using a Classifier without Negative Samples: A Local Density Regulated Optimization Algorithm.
   Sensors (Basel). 2019 Feb 13;19(4):771. doi: 10.3390/s19040771.
6. An Improved Unauthorized Unmanned Aerial Vehicle Detection Algorithm Using Radiofrequency-Based Statistical Fingerprint Analysis.
   Sensors (Basel). 2019 Jan 11;19(2):274. doi: 10.3390/s19020274.
7. A Vision-Based Approach to UAV Detection and Tracking in Cooperative Applications.
   Sensors (Basel). 2018 Oct 10;18(10):3391. doi: 10.3390/s18103391.
8. Unmanned Aerial Vehicle Object Tracking by Correlation Filter with Adaptive Appearance Model.
   Sensors (Basel). 2018 Aug 21;18(9):2751. doi: 10.3390/s18092751.
9. Remote Marker-Based Tracking for UAV Landing Using Visible-Light Camera Sensor.
   Sensors (Basel). 2017 Aug 30;17(9):1987. doi: 10.3390/s17091987.
10. Real-Time Multi-Target Localization from Unmanned Aerial Vehicles.
   Sensors (Basel). 2016 Dec 25;17(1):33. doi: 10.3390/s17010033.