IEEE Trans Neural Netw Learn Syst. 2015 Aug;26(8):1710-20. doi: 10.1109/TNNLS.2014.2352401. Epub 2014 Sep 16.
This paper presents a number of new methods for visual tracking using the output of an event-based asynchronous neuromorphic dynamic vision sensor. These methods allow multiple visual features to be tracked in real time, achieving update rates of several hundred kilohertz on a standard desktop PC. The approach has been specially adapted to exploit the event-driven properties of these sensors by combining spatial and temporal correlations of events in an asynchronous iterative framework. Various kernels, such as Gaussian, Gabor, combinations of Gabor functions, and arbitrary user-defined kernels, are used to track features from incoming events. The trackers described in this paper can handle variations in position, scale, and orientation through the use of multiple pools of trackers. This approach avoids the N² operations per event associated with conventional kernel-based convolution operations with N × N kernels. The tracking performance was evaluated experimentally for each type of kernel to demonstrate the robustness of the proposed solution.
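To illustrate the per-event update principle the abstract describes, the following is a minimal, hypothetical sketch (not the authors' implementation): a single Gaussian-kernel tracker that processes each incoming event in O(1) by weighting the event with a Gaussian centred on the current estimate (the spatial correlation) and decaying its activity between events (the temporal correlation). All names and parameter values here are illustrative assumptions.

```python
import math


class GaussianEventTracker:
    """Hypothetical sketch of an event-driven Gaussian tracker.

    Each event (x, y, t) is processed in O(1): it is weighted by a
    Gaussian kernel centred on the current position estimate, and the
    estimate is nudged toward the event. This avoids the N x N
    convolution a frame-based kernel correlation would require.
    """

    def __init__(self, x0, y0, sigma=5.0, alpha=0.1, tau=0.05):
        self.x, self.y = float(x0), float(y0)
        self.sigma = sigma    # spatial kernel width (pixels), assumed
        self.alpha = alpha    # learning rate for the centre update, assumed
        self.tau = tau        # temporal decay constant (s), assumed
        self.activity = 0.0   # accumulated event support for this tracker
        self.last_t = 0.0

    def update(self, x, y, t):
        # Temporal correlation: decay activity since the last event.
        self.activity *= math.exp(-(t - self.last_t) / self.tau)
        self.last_t = t
        # Spatial correlation: Gaussian weight of the event.
        d2 = (x - self.x) ** 2 + (y - self.y) ** 2
        w = math.exp(-d2 / (2.0 * self.sigma ** 2))
        # Shift the centre toward the event, proportionally to its weight.
        self.x += self.alpha * w * (x - self.x)
        self.y += self.alpha * w * (y - self.y)
        self.activity += w
        return w
```

A pool of such trackers with different kernel widths or orientations (e.g. Gabor instead of Gaussian weights) would sketch the multi-tracker, scale- and orientation-handling scheme the abstract mentions.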