

Adaptive Object Tracking via Multi-Angle Analysis Collaboration.

Affiliations

School of Computer Science and Technology, Tianjin University, Tianjin 300350, China.

School of Computer Software, Tianjin University, Tianjin 300350, China.

Publication Information

Sensors (Basel). 2018 Oct 24;18(11):3606. doi: 10.3390/s18113606.

DOI: 10.3390/s18113606
PMID: 30355977
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6264108/
Abstract

Although tracking research has achieved excellent performance from a mathematical angle, it is still meaningful to analyze tracking problems from multiple perspectives. This motivation not only promotes the independence of tracking research but also increases the flexibility of practical applications. This paper presents a tracking framework based on multi-dimensional state–action space reinforcement learning, termed multi-angle analysis collaboration tracking (MACT). MACT comprises a basic tracking framework and a strategic framework that assists the former. Notably, the strategic framework is extensible and currently includes a feature selection strategy (FSS) and a movement trend strategy (MTS). These strategies are abstracted from a multi-angle analysis of tracking problems (the observer's attention and the object's motion). The content of the analysis corresponds to specific actions in the multi-dimensional action space. Concretely, the tracker, regarded as an agent, is trained with a Q-learning algorithm and an ϵ-greedy exploration strategy, where a customized reward function encourages robust object tracking. Extensive comparative evaluations on the OTB50 benchmark demonstrate the effectiveness of the strategies and the improvement in speed and accuracy of the MACT tracker.
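The abstract describes the core training loop: a tabular Q-learning update paired with ϵ-greedy exploration. The paper's actual state space, action space, and reward function are not given in the abstract, so the sketch below uses hypothetical tracker actions (bounding-box shifts and a feature switch) and illustrative `alpha`/`gamma` values purely to show the mechanism:

```python
import random

def epsilon_greedy(q, state, actions, epsilon):
    # Explore a random action with probability epsilon; otherwise exploit
    # the action with the highest current Q-value (unseen pairs default to 0).
    if random.random() < epsilon:
        return random.choice(actions)
    return max(actions, key=lambda a: q.get((state, a), 0.0))

def q_update(q, state, action, reward, next_state, actions, alpha=0.1, gamma=0.9):
    # One tabular Q-learning step:
    #   Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
    best_next = max(q.get((next_state, a), 0.0) for a in actions)
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + alpha * (reward + gamma * best_next - old)

# Hypothetical action set (the paper's multi-dimensional action space is not
# reproduced here): move the bounding box or switch the feature channel.
ACTIONS = ["left", "right", "up", "down", "stay", "switch_feature"]
```

A single update from an empty table with reward 1.0 yields `Q(s0, left) = 0.1` under these defaults, after which `epsilon_greedy` with `epsilon=0` deterministically selects `left` in state `s0`.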


Figure images (g001–g018, PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/862cc3985b96/sensors-18-03606-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/065a5b76d38e/sensors-18-03606-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/add22c16a297/sensors-18-03606-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/03ca05ee5239/sensors-18-03606-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/a9e0a3a7c53f/sensors-18-03606-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/ff39f5ac341d/sensors-18-03606-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/3201a803c533/sensors-18-03606-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/a88e2a3e5d6b/sensors-18-03606-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/a32837c68af0/sensors-18-03606-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/3fefb4e34c77/sensors-18-03606-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/d2bb28c6d654/sensors-18-03606-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/317d61188e40/sensors-18-03606-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/dcbe769dbbc0/sensors-18-03606-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/e45a7ffac042/sensors-18-03606-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/a6f9e76e27b0/sensors-18-03606-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/7c0953fea083/sensors-18-03606-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/14ba3155514d/sensors-18-03606-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7809/6264108/3037e16b889b/sensors-18-03606-g018.jpg

Similar Articles

1. Adaptive Object Tracking via Multi-Angle Analysis Collaboration.
Sensors (Basel). 2018 Oct 24;18(11):3606. doi: 10.3390/s18113606.
2. Beyond Greedy Search: Tracking by Multi-Agent Reinforcement Learning-Based Beam Search.
IEEE Trans Image Process. 2022;31:6239-6254. doi: 10.1109/TIP.2022.3208437. Epub 2022 Sep 30.
3. Learning Adaptive Discriminative Correlation Filters via Temporal Consistency Preserving Spatial Feature Selection for Robust Visual Object Tracking.
IEEE Trans Image Process. 2019 Nov;28(11):5596-5609. doi: 10.1109/TIP.2019.2919201. Epub 2019 Jun 3.
4. Interacting Multiview Tracker.
IEEE Trans Pattern Anal Mach Intell. 2016 May;38(5):903-17. doi: 10.1109/TPAMI.2015.2473862. Epub 2015 Aug 27.
5. Effective Visual Tracking Using Multi-Block and Scale Space Based on Kernelized Correlation Filters.
Sensors (Basel). 2017 Feb 23;17(3):433. doi: 10.3390/s17030433.
6. Multi-Complementary Model for Long-Term Tracking.
Sensors (Basel). 2018 Feb 9;18(2):527. doi: 10.3390/s18020527.
7. Robust Object Tracking With Discrete Graph-Based Multiple Experts.
IEEE Trans Image Process. 2017 Jun;26(6):2736-2750. doi: 10.1109/TIP.2017.2686601. Epub 2017 Mar 23.
8. SAFS: Object Tracking Algorithm Based on Self-Adaptive Feature Selection.
Sensors (Basel). 2021 Jun 11;21(12):4030. doi: 10.3390/s21124030.
9. Correlation-Based Tracker-Level Fusion for Robust Visual Tracking.
IEEE Trans Image Process. 2017 Oct;26(10):4832-4842. doi: 10.1109/TIP.2017.2699791. Epub 2017 Apr 28.
10. Unified Graph-Based Multicue Feature Fusion for Robust Visual Tracking.
IEEE Trans Cybern. 2020 Jun;50(6):2357-2368. doi: 10.1109/TCYB.2019.2920289. Epub 2019 Jun 25.

Cited By

1. The White Matter Functional Abnormalities in Patients with Transient Ischemic Attack: A Reinforcement Learning Approach.
Neural Plast. 2022 Oct 17;2022:1478048. doi: 10.1155/2022/1478048. eCollection 2022.

References

1. NUS-PRO: A New Visual Tracking Challenge.
IEEE Trans Pattern Anal Mach Intell. 2016 Feb;38(2):335-49. doi: 10.1109/TPAMI.2015.2417577.
2. Visual Tracking: An Experimental Survey.
IEEE Trans Pattern Anal Mach Intell. 2014 Jul;36(7):1442-68. doi: 10.1109/TPAMI.2013.230.
3. High-Speed Tracking with Kernelized Correlation Filters.
IEEE Trans Pattern Anal Mach Intell. 2015 Mar;37(3):583-96. doi: 10.1109/TPAMI.2014.2345390.
4. Human-level control through deep reinforcement learning.
Nature. 2015 Feb 26;518(7540):529-33. doi: 10.1038/nature14236.
5. Tracking-Learning-Detection.
IEEE Trans Pattern Anal Mach Intell. 2012 Jul;34(7):1409-22. doi: 10.1109/TPAMI.2011.239. Epub 2011 Dec 13.
6. What attributes guide the deployment of visual attention and how do they do it?
Nat Rev Neurosci. 2004 Jun;5(6):495-501. doi: 10.1038/nrn1411.