Fu Changhong, Duan Ran, Kircali Dogan, Kayacan Erdal
School of Mechanical and Aerospace Engineering, Nanyang Technological University (NTU), 50 Nanyang Avenue, Singapore 639798, Singapore.
ST Engineering-NTU Corporate Laboratory, Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798, Singapore.
Sensors (Basel). 2016 Aug 31;16(9):1406. doi: 10.3390/s16091406.
In this paper, we present a novel onboard robust visual algorithm for long-term, arbitrary 2D and 3D object tracking, built on a reliable global-local object model, for unmanned aerial vehicle (UAV) applications such as autonomously tracking and chasing a moving target. The first main component of the algorithm is a combined global-matching and local-tracking approach: feature correspondences are initially found using an improved binary descriptor developed for global feature matching, and an iterative Lucas-Kanade optical flow algorithm is employed for local feature tracking. The second main component is an efficient local geometric filter (LGF), which handles outlier feature correspondences based on a new forward-backward pairwise dissimilarity measure, thereby maintaining pairwise geometric consistency. In the proposed LGF module, hierarchical agglomerative clustering, i.e., bottom-up aggregation, is applied using an effective single-link method. The third component is a heuristic local outlier factor (to the best of our knowledge, used here for the first time to handle outlier features in a visual tracking application), which further improves the representation of the target object by formulating outlier feature detection as a binary classification problem on the output features of the LGF module. Extensive UAV flight experiments show that the proposed visual tracker achieves real-time frame rates of more than 35 frames per second on an i7 processor at 640 × 512 image resolution and compares favorably with the most popular state-of-the-art trackers in terms of robustness, efficiency, and accuracy.
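The single-link agglomerative clustering step of the LGF module can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the pairwise dissimilarity here is a simple distance between per-feature displacement vectors (the paper's forward-backward measure is more elaborate), and all function names and the cut threshold are illustrative. Single-link clustering at a fixed distance cut is equivalent to finding connected components of the thresholded dissimilarity graph, implemented below with union-find; the largest cluster is kept as the geometrically consistent set of correspondences.

```python
import numpy as np

def pairwise_dissimilarity(prev_pts, curr_pts):
    """Illustrative pairwise dissimilarity between feature correspondences:
    how differently two features move between frames. Correspondences on a
    rigid target should move consistently, giving low pairwise values."""
    motion = curr_pts - prev_pts                       # per-feature displacement
    # Dissimilarity of pair (i, j) = distance between their motion vectors.
    diff = motion[:, None, :] - motion[None, :, :]
    return np.linalg.norm(diff, axis=2)

def single_link_inliers(prev_pts, curr_pts, cut_threshold=2.0):
    """Bottom-up (agglomerative) single-link clustering on the dissimilarity
    matrix, cut at a fixed threshold; the largest cluster is returned as a
    boolean inlier mask over the correspondences."""
    n = len(prev_pts)
    d = pairwise_dissimilarity(prev_pts, curr_pts)
    # Union-find: merging every pair below the cut yields exactly the
    # single-link clusters at that threshold (connected components).
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]              # path halving
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if d[i, j] < cut_threshold:
                parent[find(i)] = find(j)
    labels = np.array([find(i) for i in range(n)])
    largest = np.bincount(labels).argmax()
    return labels == largest
```

For example, eight features translating together by (5, 0) pixels form one cluster, while two features with inconsistent motions end up in singleton clusters and are filtered out.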
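The local-outlier-factor (LOF) stage can be illustrated with a minimal NumPy implementation of the standard LOF score and a thresholded binary decision. This is a sketch under simple assumptions, not the paper's heuristic variant: the neighborhood size `k` and the decision threshold are illustrative, and the input here is just a 2D point set standing in for the LGF output features.

```python
import numpy as np

def local_outlier_factor(X, k=3):
    """Standard LOF score: a point whose local density is much lower than
    that of its k nearest neighbours scores well above 1."""
    n = len(X)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(dist, np.inf)                 # exclude self-distance
    knn = np.argsort(dist, axis=1)[:, :k]          # k nearest neighbours
    nn_dist = np.take_along_axis(dist, knn, axis=1)
    k_dist = nn_dist[:, -1]                        # k-distance of each point
    # Reachability distance of p from o: max(k-distance(o), d(p, o)).
    reach = np.maximum(k_dist[knn], nn_dist)
    lrd = k / reach.sum(axis=1)                    # local reachability density
    return lrd[knn].mean(axis=1) / lrd             # neighbours' density / own

def classify_inliers(X, k=3, threshold=1.5):
    """Binary classification of features: LOF below threshold -> inlier."""
    return local_outlier_factor(X, k) < threshold
```

On a tight cluster of features plus one isolated point, the clustered features score near 1 and are kept, while the isolated point scores far above the threshold and is rejected.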