


Gracker: A Graph-Based Planar Object Tracker.

Publication Information

IEEE Trans Pattern Anal Mach Intell. 2018 Jun;40(6):1494-1501. doi: 10.1109/TPAMI.2017.2716350. Epub 2017 Jun 16.

DOI: 10.1109/TPAMI.2017.2716350
PMID: 28641246
Abstract

Matching-based algorithms have been commonly used in planar object tracking. They often model a planar object as a set of keypoints, and then find correspondences between keypoint sets via descriptor matching. In previous work, unary constraints on appearances or locations are usually used to guide the matching. However, these approaches rarely utilize structure information of the object, and thus suffer from various perturbation factors. In this paper, we propose a graph-based tracker, named Gracker, which is able to fully explore the structure information of the object to enhance tracking performance. We model a planar object as a graph, instead of a simple collection of keypoints, to represent its structure. Then, we reformulate tracking as a sequential graph matching process, which establishes keypoint correspondence in a geometric graph matching manner. For evaluation, we compare the proposed Gracker with state-of-the-art planar object trackers on three benchmark datasets: two public ones and a newly collected one. Experimental results show that Gracker achieves robust tracking results against various environmental variations, and outperforms other algorithms in general on the datasets.
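The core idea in the abstract, combining unary descriptor similarity with pairwise geometric (structural) constraints when matching keypoint sets, can be illustrated with a minimal sketch. This is not the authors' algorithm: the k-nearest-neighbour graph construction, the greedy assignment, and the edge-length distortion penalty (`alpha`) are all illustrative assumptions standing in for the paper's geometric graph matching formulation.

```python
import numpy as np

def build_knn_graph(points, k=2):
    """Connect each keypoint to its k nearest neighbours; the edge set
    encodes the object's structure, not just its keypoint locations."""
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    edges = set()
    for i in range(len(points)):
        for j in np.argsort(d[i])[:k]:
            edges.add((min(i, j), max(i, j)))
    return sorted(edges)

def match(points_a, desc_a, points_b, desc_b, k=2, alpha=0.5):
    """Greedy matching that scores each candidate correspondence by
    descriptor distance (unary term) plus edge-length distortion
    relative to already-matched neighbours (pairwise term)."""
    edges = build_knn_graph(points_a, k)
    unary = np.linalg.norm(desc_a[:, None] - desc_b[None, :], axis=-1)
    assign = {}
    for i in np.argsort(unary.min(axis=1)):   # most confident keypoints first
        best, best_cost = None, np.inf
        for j in range(len(points_b)):
            if j in assign.values():          # one-to-one assignment
                continue
            cost = unary[i, j]
            for (u, v) in edges:
                other = v if u == i else u if v == i else None
                if other is not None and other in assign:
                    la = np.linalg.norm(points_a[i] - points_a[other])
                    lb = np.linalg.norm(points_b[j] - points_b[assign[other]])
                    cost += alpha * abs(la - lb)   # penalize edge stretching
            if cost < best_cost:
                best, best_cost = j, cost
        assign[i] = best
    return assign

# Usage: a unit square of keypoints, tracked after a pure translation.
pts = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
desc = np.eye(4)                              # toy one-hot descriptors
correspondence = match(pts, desc, pts + np.array([5., 0.]), desc)
```

The pairwise term is what separates this from plain descriptor matching: a candidate that fits the descriptor but distorts the graph's edge lengths is penalized, which is the intuition behind exploiting structure information against perturbations.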


Similar Articles

1. Gracker: A Graph-Based Planar Object Tracker.
   IEEE Trans Pattern Anal Mach Intell. 2018 Jun;40(6):1494-1501. doi: 10.1109/TPAMI.2017.2716350. Epub 2017 Jun 16.
2. Robust Object Tracking With Discrete Graph-Based Multiple Experts.
   IEEE Trans Image Process. 2017 Jun;26(6):2736-2750. doi: 10.1109/TIP.2017.2686601. Epub 2017 Mar 23.
3. Robust deformable and occluded object tracking with dynamic graph.
   IEEE Trans Image Process. 2014 Dec;23(12):5497-509. doi: 10.1109/TIP.2014.2364919.
4. Multi-Task Structure-Aware Context Modeling for Robust Keypoint-Based Object Tracking.
   IEEE Trans Pattern Anal Mach Intell. 2019 Apr;41(4):915-927. doi: 10.1109/TPAMI.2018.2818132. Epub 2018 Mar 22.
5. Onboard Robust Visual Tracking for UAVs Using a Reliable Global-Local Object Model.
   Sensors (Basel). 2016 Aug 31;16(9):1406. doi: 10.3390/s16091406.
6. Log-Spiral Keypoint: A Robust Approach toward Image Patch Matching.
   Comput Intell Neurosci. 2015;2015:457495. doi: 10.1155/2015/457495. Epub 2015 May 5.
7. Correlation-Based Tracker-Level Fusion for Robust Visual Tracking.
   IEEE Trans Image Process. 2017 Oct;26(10):4832-4842. doi: 10.1109/TIP.2017.2699791. Epub 2017 Apr 28.
8. Interacting Multiview Tracker.
   IEEE Trans Pattern Anal Mach Intell. 2016 May;38(5):903-17. doi: 10.1109/TPAMI.2015.2473862. Epub 2015 Aug 27.
9. Visual Tracking via Dynamic Graph Learning.
   IEEE Trans Pattern Anal Mach Intell. 2019 Nov;41(11):2770-2782. doi: 10.1109/TPAMI.2018.2864965. Epub 2018 Aug 13.
10. Fast ORB-SLAM Without Keypoint Descriptors.
    IEEE Trans Image Process. 2022;31:1433-1446. doi: 10.1109/TIP.2021.3136710. Epub 2022 Feb 3.