Robust Object Tracking Based on Motion Consistency

Authors

He Lijun, Qiao Xiaoya, Wen Shuai, Li Fan

Affiliation

Department of Information and Communication Engineering, School of Electronic and Information Engineering, Xi'an Jiaotong University, Xi'an 710049, China.

Publication

Sensors (Basel). 2018 Feb 13;18(2):572. doi: 10.3390/s18020572.

DOI: 10.3390/s18020572
PMID: 29438323
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5854992/
Abstract

Object tracking is an important research direction in computer vision and is widely used in video surveillance, security monitoring, video analysis and other fields. Conventional tracking algorithms perform poorly in specific scenes, such as a target with fast motion and occlusion. The candidate samples may lose the true target due to its fast motion. Moreover, the appearance of the target may change with movement. In this paper, we propose an object tracking algorithm based on motion consistency. In the state transition model, candidate samples are obtained by the target state, which is predicted according to the temporal correlation. In the appearance model, we define the position factor to represent the different importance of candidate samples in different positions using the double Gaussian probability model. The candidate sample with highest likelihood is selected as the tracking result by combining the holistic and local responses with the position factor. Moreover, an adaptive template updating scheme is proposed to adapt to the target's appearance changes, especially those caused by fast motion. The experimental results on a 2013 benchmark dataset demonstrate that the proposed algorithm performs better in scenes with fast motion and partial or full occlusion compared to the state-of-the-art algorithms.
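The pipeline the abstract describes (predict the target state from temporal correlation, weight candidate samples with a double-Gaussian position factor, then combine holistic and local responses to pick the most likely candidate) can be sketched in Python. This is a minimal illustration under stated assumptions, not the authors' implementation: the constant-velocity predictor, the sigma and mixing values, and all function names are hypothetical.

```python
import numpy as np

def predict_position(track):
    """Constant-velocity prediction from the two most recent centre
    positions -- one simple way to exploit the temporal correlation
    the abstract refers to (assumption, not the paper's exact model)."""
    (px, py), (cx, cy) = track[-2], track[-1]
    return np.array([2 * cx - px, 2 * cy - py], dtype=float)

def position_factor(candidates, predicted, sigma_near=8.0, sigma_far=24.0, w=0.7):
    """Double-Gaussian position factor: a mixture of a narrow and a wide
    Gaussian centred on the predicted position, so candidates near the
    prediction are weighted up without discarding distant ones outright."""
    d2 = np.sum((candidates - predicted) ** 2, axis=1)
    g_near = np.exp(-d2 / (2 * sigma_near ** 2))
    g_far = np.exp(-d2 / (2 * sigma_far ** 2))
    return w * g_near + (1 - w) * g_far

def select_candidate(holistic, local, pos_w):
    """Fuse holistic and local appearance responses with the position
    factor and return the index of the most likely candidate."""
    likelihood = holistic * local * pos_w
    return int(np.argmax(likelihood))

# Toy usage: two previous centre positions, two candidate samples.
track = [(10.0, 10.0), (14.0, 12.0)]
pred = predict_position(track)                 # constant-velocity guess
cands = np.array([[18.0, 14.0], [40.0, 40.0]])
pw = position_factor(cands, pred)
best = select_candidate(np.array([0.9, 0.9]), np.array([0.8, 0.8]), pw)
```

With equal appearance responses, the candidate at the predicted position wins purely on the position factor, which is the role the abstract assigns to it in fast-motion scenes.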


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/61951cb79124/sensors-18-00572-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/63240c94d919/sensors-18-00572-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/700bf0f69db1/sensors-18-00572-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/20ef5197f7c0/sensors-18-00572-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/b2a472749838/sensors-18-00572-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/25f9d0e69141/sensors-18-00572-g006a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/2c44d60f364c/sensors-18-00572-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/5f925295fe37/sensors-18-00572-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/8da84a5b7cd1/sensors-18-00572-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/78563dcdad19/sensors-18-00572-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/fd6c932d104f/sensors-18-00572-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/a8f6d30598fd/sensors-18-00572-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/e07cd5d6f8dd/sensors-18-00572-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/a873c177a8da/sensors-18-00572-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/8bf131d30d23/sensors-18-00572-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/7a114f121a96/sensors-18-00572-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/c08c805e8306/sensors-18-00572-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/96ed/5854992/e6fc16ea5832/sensors-18-00572-g018.jpg

Similar Articles

1. Scene-Aware Adaptive Updating for Visual Tracking via Correlation Filters. Sensors (Basel). 2017 Nov 15;17(11):2626. doi: 10.3390/s17112626.
2. Robust Fusion of Color and Depth Data for RGB-D Target Tracking Using Adaptive Range-Invariant Depth Models and Spatio-Temporal Consistency Constraints. IEEE Trans Cybern. 2018 Aug;48(8):2485-2499. doi: 10.1109/TCYB.2017.2740952. Epub 2017 Sep 6.
3. Occlusion-Aware Fragment-Based Tracking With Spatial-Temporal Consistency. IEEE Trans Image Process. 2016 Aug;25(8):3814-25. doi: 10.1109/TIP.2016.2580463. Epub 2016 Jun 13.
4. Motion-Aware Correlation Filters for Online Visual Tracking. Sensors (Basel). 2018 Nov 14;18(11):3937. doi: 10.3390/s18113937.
5. Visual Tracking via Coarse and Fine Structural Local Sparse Appearance Models. IEEE Trans Image Process. 2016 Oct;25(10):4555-64. doi: 10.1109/TIP.2016.2592701. Epub 2016 Jul 18.
6. Patchwise joint sparse tracking with occlusion detection. IEEE Trans Image Process. 2014 Oct;23(10):4496-510. doi: 10.1109/TIP.2014.2346029. Epub 2014 Aug 7.
7. Enhancement of ELDA Tracker Based on CNN Features and Adaptive Model Update. Sensors (Basel). 2016 Apr 15;16(4):545. doi: 10.3390/s16040545.
8. SAFS: Object Tracking Algorithm Based on Self-Adaptive Feature Selection. Sensors (Basel). 2021 Jun 11;21(12):4030. doi: 10.3390/s21124030.
9. Applying mean shift, motion information and Kalman filtering approaches to object tracking. ISA Trans. 2012 May;51(3):485-97. doi: 10.1016/j.isatra.2012.02.002. Epub 2012 Mar 10.
