
Mode-of-disparities error correction of eye-tracking data.

Affiliation

Department of Computer and Information Science, University of Oregon, 1202 University of Oregon, Eugene, Oregon 97403-1202, USA.

Publication Info

Behav Res Methods. 2011 Sep;43(3):834-42. doi: 10.3758/s13428-011-0073-0.

DOI: 10.3758/s13428-011-0073-0
PMID: 21487905
Abstract

In eye-tracking research, there is almost always a disparity between a person's actual gaze location and the location recorded by the eye tracker. Disparities that are constant over time are systematic error. In this article, we propose an error correction method that can reliably reduce systematic error and restore fixations to their true locations. We show that the method is reliable when the visual objects in the experiment are arranged in an irregular manner; for example, when they are not on a grid in which all fixations can be shifted to adjacent locations using the same directional adjustment. The method first calculates the disparities between fixations and their nearest objects. It then uses the annealed mean shift algorithm to find the mode of the disparities. The mode is demonstrated to correctly capture the magnitude and direction of the systematic error so that it can be removed. This article presents the method, an extended demonstration, and a validation of the method's efficacy.
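
The pipeline the abstract describes (compute each fixation's offset to its nearest object, find the mode of those offsets, then subtract it from every fixation) can be sketched compactly. The Python below is a minimal illustration, not the authors' implementation: a plain mean shift whose kernel bandwidth shrinks geometrically stands in for the paper's annealed mean shift, and the function names and parameters (nearest_object_disparities, mode_of_disparities, h, decay) are assumptions made for this sketch.

import numpy as np

def nearest_object_disparities(fixations, objects):
    # For each fixation, the vector to its nearest object (object - fixation).
    diffs = objects[None, :, :] - fixations[:, None, :]   # (N, M, 2)
    nearest = np.argmin(np.linalg.norm(diffs, axis=2), axis=1)
    return diffs[np.arange(len(fixations)), nearest]      # (N, 2)

def mode_of_disparities(d, h=50.0, h_min=1.0, decay=0.9, iters=200):
    # Mean-shift estimate of the mode of 2-D disparity vectors. The
    # bandwidth h shrinks each iteration (a simple "annealing") so the
    # estimate first locks onto the dominant cluster, then sharpens
    # toward its peak.
    x = d.mean(axis=0)                                    # start at the centroid
    for _ in range(iters):
        w = np.exp(-np.sum((d - x) ** 2, axis=1) / (2.0 * h * h))
        x = (w[:, None] * d).sum(axis=0) / w.sum()
        h = max(h * decay, h_min)
    return x

# Toy data: true gaze falls on the objects, but the tracker reports every
# fixation shifted by a constant (8, -5) pixels plus small noise.
rng = np.random.default_rng(0)
objects = np.array([[100.0, 100.0], [240.0, 120.0], [60.0, 210.0], [180.0, 260.0]])
fixations = objects + np.array([8.0, -5.0]) + rng.normal(0.0, 1.0, objects.shape)

mode = mode_of_disparities(nearest_object_disparities(fixations, objects))
corrected = fixations + mode      # shift all fixations by the modal disparity
print(mode)                       # roughly [-8, 5]: the systematic error, negated

The point of taking the mode rather than the mean is robustness: fixations that happen to land nearest to the wrong object produce stray disparity vectors, and the mode of the dominant cluster ignores them where a mean would be pulled off target.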


Similar Articles

1. Mode-of-disparities error correction of eye-tracking data.
Behav Res Methods. 2011 Sep;43(3):834-42. doi: 10.3758/s13428-011-0073-0.
2. A data-driven algorithm for offline pupil signal preprocessing and eyeblink detection in low-speed eye-tracking protocols.
Behav Res Methods. 2011 Jun;43(2):372-83. doi: 10.3758/s13428-010-0055-7.
3. Head movement compensation and multi-modal event detection in eye-tracking data for unconstrained head movements.
J Neurosci Methods. 2016 Dec 1;274:13-26. doi: 10.1016/j.jneumeth.2016.09.005. Epub 2016 Sep 28.
4. A brain-computer interface method combined with eye tracking for 3D interaction.
J Neurosci Methods. 2010 Jul 15;190(2):289-98. doi: 10.1016/j.jneumeth.2010.05.008. Epub 2010 May 16.
5. Exploiting human sensitivity to gaze for tracking the eyes.
Behav Res Methods. 2011 Sep;43(3):843-52. doi: 10.3758/s13428-011-0078-8.
6. Assessment of visual orienting behaviour in young children using remote eye tracking: methodology and reliability.
J Neurosci Methods. 2010 Jun 15;189(2):252-6. doi: 10.1016/j.jneumeth.2010.04.005. Epub 2010 Apr 13.
7. Anchoring gaze when categorizing faces' sex: evidence from eye-tracking data.
Vision Res. 2009 Nov;49(23):2870-80. doi: 10.1016/j.visres.2009.09.001. Epub 2009 Sep 4.
8. iMap: a novel method for statistical fixation mapping of eye movement data.
Behav Res Methods. 2011 Sep;43(3):864-78. doi: 10.3758/s13428-011-0092-x.
9. [Evaluation algorithm for eye movement patterns during a problem solving task].
Z Exp Angew Psychol. 1992;39(4):646-61.
10. Locations of serial reach targets are coded in multiple reference frames.
Vision Res. 2010 Dec;50(24):2651-60. doi: 10.1016/j.visres.2010.09.013. Epub 2010 Sep 17.

Cited By

1. On the Validity and Benefit of Manual and Automated Drift Correction in Reading Tasks.
J Eye Mov Res. 2025 May 9;18(3):17. doi: 10.3390/jemr18030017. eCollection 2025 Jun.
2. Gaze Error Estimation and Linear Transformation to Improve Accuracy of Video-Based Eye Trackers.
Vision (Basel). 2025 Apr 3;9(2):29. doi: 10.3390/vision9020029.
3. Combining automation and expertise: A semi-automated approach to correcting eye-tracking data in reading tasks.
Behav Res Methods. 2025 Jan 24;57(2):72. doi: 10.3758/s13428-025-02597-3.
4. The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study.
Behav Res Methods. 2025 Jan 6;57(1):46. doi: 10.3758/s13428-024-02529-7.
5. Advancing Dynamic-Time Warp Techniques for Correcting Eye Tracking Data in Reading Source Code.
J Eye Mov Res. 2024 Mar 18;17(1). doi: 10.16910/jemr.17.1.4. eCollection 2024.
6. Disrupting Short-Term Memory Maintenance in Premotor Cortex Affects Serial Dependence in Visuomotor Integration.
J Neurosci. 2021 Nov 10;41(45):9392-9402. doi: 10.1523/JNEUROSCI.0380-21.2021. Epub 2021 Oct 4.
7. An interactive eye-tracking system for measuring radiologists' visual fixations in volumetric CT images: Implementation and initial eye-tracking accuracy validation.
Med Phys. 2021 Nov;48(11):6710-6723. doi: 10.1002/mp.15219. Epub 2021 Oct 6.
8. Application of Eye Tracking Technology in Aviation, Maritime, and Construction Industries: A Systematic Review.
Sensors (Basel). 2021 Jun 23;21(13):4289. doi: 10.3390/s21134289.
9. Algorithms for the automated correction of vertical drift in eye-tracking data.
Behav Res Methods. 2022 Feb;54(1):287-310. doi: 10.3758/s13428-021-01554-0. Epub 2021 Jun 22.
10. Enhancing the usability of low-cost eye trackers for rehabilitation applications.
PLoS One. 2018 Jun 1;13(6):e0196348. doi: 10.1371/journal.pone.0196348. eCollection 2018.