
Characterizing gaze position signals and synthesizing noise during fixations in eye-tracking data.

Affiliations

Lund University Humanities Laboratory and Department of Psychology, Lund University, Lund, Sweden.

Šiauliai University, Šiauliai, Lithuania.

Publication

Behav Res Methods. 2020 Dec;52(6):2515-2534. doi: 10.3758/s13428-020-01400-9.

DOI: 10.3758/s13428-020-01400-9
PMID: 32472501
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7725698/
Abstract

The magnitude of variation in the gaze position signals recorded by an eye tracker, also known as its precision, is an important aspect of an eye tracker's data quality. However, data quality of eye-tracking signals is still poorly understood. In this paper, we therefore investigate the following: (1) How do the various available measures characterizing eye-tracking data during fixation relate to each other? (2) How are they influenced by signal type? (3) What type of noise should be used to augment eye-tracking data when evaluating eye-movement analysis methods? To support our analysis, this paper presents new measures to characterize signal type and signal magnitude based on RMS-S2S and STD, two established measures of precision. Simulations are performed to investigate how each of these measures depends on the number of gaze position samples over which they are calculated, and to reveal how RMS-S2S and STD relate to each other and to measures characterizing the temporal spectrum composition of the recorded gaze position signal. Further empirical investigations were performed using gaze position data recorded with five eye trackers from human and artificial eyes. We found that although the examined eye trackers produce gaze position signals with different characteristics, the relations between precision measures derived from simulations are borne out by the data. We furthermore conclude that data with a range of signal type values should be used to assess the robustness of eye-movement analysis methods. We present a method for generating artificial eye-tracker noise of any signal type and magnitude.
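The abstract names RMS-S2S and STD as the two established precision measures, and describes a generator for artificial eye-tracker noise of any signal type and magnitude. The sketch below is only an illustration of these ideas, not the paper's implementation: function names are invented, and "signal type" is approximated here by simple 1/f^alpha spectral shaping of white noise, whereas the paper develops its own measures of spectral composition.

```python
import numpy as np

def rms_s2s(x, y):
    """RMS of sample-to-sample distances: precision measure sensitive to
    fast, sample-to-sample variation in the gaze position signal."""
    d2 = np.diff(x) ** 2 + np.diff(y) ** 2
    return np.sqrt(np.mean(d2))

def std_precision(x, y):
    """STD precision: dispersion of gaze samples around their centroid,
    combined over the horizontal and vertical components."""
    return np.sqrt(np.var(x) + np.var(y))

def colored_noise(n, alpha, rng=None):
    """Gaussian noise whose power spectrum is shaped toward 1/f**alpha
    (alpha=0 white, larger alpha progressively 'slower' noise),
    normalized to unit standard deviation."""
    rng = np.random.default_rng(rng)
    white = rng.standard_normal(n)
    spec = np.fft.rfft(white)
    f = np.fft.rfftfreq(n)
    f[0] = f[1]                      # avoid division by zero at DC
    spec *= f ** (-alpha / 2)        # shape the amplitude spectrum
    out = np.fft.irfft(spec, n)
    return out / np.std(out)         # unit magnitude; rescale as needed
```

Multiplying the output of `colored_noise` by a target STD, and adding it to clean gaze data, is one simple way to augment recordings when stress-testing an event-detection algorithm; note that for a fixed magnitude, changing `alpha` changes RMS-S2S but not STD, which is why the two measures together characterize both magnitude and type.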


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/eb8812b6cc1b/13428_2020_1400_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/b9c29d0c98cf/13428_2020_1400_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/0fa115bf026e/13428_2020_1400_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/7e52af76d0fc/13428_2020_1400_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/65aee5d99a16/13428_2020_1400_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/55583edb659c/13428_2020_1400_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/4394268e5d11/13428_2020_1400_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/3de9b8b9afcc/13428_2020_1400_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/ed9dd694752b/13428_2020_1400_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/b2925c40a91f/13428_2020_1400_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/185b/7725698/190085f976d8/13428_2020_1400_Fig11_HTML.jpg

Similar articles

1
Characterizing gaze position signals and synthesizing noise during fixations in eye-tracking data.
Behav Res Methods. 2020 Dec;52(6):2515-2534. doi: 10.3758/s13428-020-01400-9.
2
Is apparent fixational drift in eye-tracking data due to filters or eyeball rotation?
Behav Res Methods. 2021 Feb;53(1):311-324. doi: 10.3758/s13428-020-01414-3.
3
Small eye movements cannot be reliably measured by video-based P-CR eye-trackers.
Behav Res Methods. 2020 Oct;52(5):2098-2121. doi: 10.3758/s13428-020-01363-x.
4
Replacing eye trackers in ongoing studies: A comparison of eye-tracking data quality between the Tobii Pro TX300 and the Tobii Pro Spectrum.
Infancy. 2022 Jan;27(1):25-45. doi: 10.1111/infa.12441. Epub 2021 Oct 22.
5
Small head movements increase and colour noise in data from five video-based P-CR eye trackers.
Behav Res Methods. 2022 Apr;54(2):845-863. doi: 10.3758/s13428-021-01648-9. Epub 2021 Aug 6.
6
Development of Open-source Software and Gaze Data Repositories for Performance Evaluation of Eye Tracking Systems.
Vision (Basel). 2019 Oct 22;3(4):55. doi: 10.3390/vision3040055.
7
What to expect from your remote eye-tracker when participants are unrestrained.
Behav Res Methods. 2018 Feb;50(1):213-227. doi: 10.3758/s13428-017-0863-0.
8
Performance Evaluation Strategies for Eye Gaze Estimation Systems with Quantitative Metrics and Visualizations.
Sensors (Basel). 2018 Sep 18;18(9):3151. doi: 10.3390/s18093151.
9
Outlier-Robust Gaze Signal Filtering Framework Based on Eye-Movement Modality Recognition and Set-Membership Approach.
IEEE Trans Biomed Eng. 2023 Aug;70(8):2463-2474. doi: 10.1109/TBME.2023.3249233. Epub 2023 Jul 18.
10
Pupil size dynamics during fixation impact the accuracy and precision of video-based gaze estimation.
Vision Res. 2016 Jan;118:48-59. doi: 10.1016/j.visres.2014.12.018. Epub 2015 Jan 9.

Cited by

1
When is enough enough? Empirical guidelines to determine participant sample size for scene viewing studies.
Behav Res Methods. 2025 Jul 28;57(9):241. doi: 10.3758/s13428-025-02754-8.
2
Hilbert-Huang transform based pupil changes analysis for concentration assessment in skilled mowing.
Sci Rep. 2025 Jul 1;15(1):21862. doi: 10.1038/s41598-025-08203-y.
3
TittaLSL: A toolbox for creating networked eye-tracking experiments in Python and MATLAB with Tobii eye trackers.
Behav Res Methods. 2025 Jun 4;57(7):190. doi: 10.3758/s13428-025-02714-2.
4
gazeMapper: A tool for automated world-based analysis of gaze data from one or multiple wearable eye trackers.
Behav Res Methods. 2025 Jun 3;57(7):188. doi: 10.3758/s13428-025-02704-4.
5
LEyes: A lightweight framework for deep learning-based eye tracking using synthetic eye images.
Behav Res Methods. 2025 Mar 31;57(5):129. doi: 10.3758/s13428-025-02645-y.
6
Unstable foveation's impact on reading, object tracking, and its implications for diagnosing and intervening in reading difficulties.
Sci Rep. 2025 Feb 24;15(1):6546. doi: 10.1038/s41598-024-83316-4.
7
The fundamentals of eye tracking part 3: How to choose an eye tracker.
Behav Res Methods. 2025 Jan 22;57(2):67. doi: 10.3758/s13428-024-02587-x.
8
The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study.
Behav Res Methods. 2025 Jan 6;57(1):46. doi: 10.3758/s13428-024-02529-7.
9
Eye tracker calibration: How well can humans refixate a target?
Behav Res Methods. 2024 Dec 19;57(1):23. doi: 10.3758/s13428-024-02564-4.
10
Enhancing eye tracking for nonhuman primates and other subjects unable to follow instructions: Adaptive calibration and validation of Tobii eye trackers with the Titta toolbox.
Behav Res Methods. 2024 Dec 4;57(1):4. doi: 10.3758/s13428-024-02540-y.

References

1
Is apparent fixational drift in eye-tracking data due to filters or eyeball rotation?
Behav Res Methods. 2021 Feb;53(1):311-324. doi: 10.3758/s13428-020-01414-3.
2
Small eye movements cannot be reliably measured by video-based P-CR eye-trackers.
Behav Res Methods. 2020 Oct;52(5):2098-2121. doi: 10.3758/s13428-020-01363-x.
3
The impact of slippage on the data quality of head-worn eye trackers.
Behav Res Methods. 2020 Jun;52(3):1140-1160. doi: 10.3758/s13428-019-01307-0.
4
The effects of fixational tremor on the retinal image.
J Vis. 2019 Sep 3;19(11):8. doi: 10.1167/19.11.8.
5
gazeNet: End-to-end eye-movement event detection with deep neural networks.
Behav Res Methods. 2019 Apr;51(2):840-864. doi: 10.3758/s13428-018-1133-5.
6
Is the eye-movement field confused about fixations and saccades? A survey among 124 researchers.
R Soc Open Sci. 2018 Aug 29;5(8):180502. doi: 10.1098/rsos.180502. eCollection 2018 Aug.
7
The Accuracy and Precision of Position and Orientation Tracking in the HTC Vive Virtual Reality System for Scientific Research.
Iperception. 2017 May 18;8(3):2041669517708205. doi: 10.1177/2041669517708205. eCollection 2017 May-Jun.
8
Using machine learning to detect events in eye-tracking data.
Behav Res Methods. 2018 Feb;50(1):160-181. doi: 10.3758/s13428-017-0860-3.
9
The effect of sampling rate and lowpass filters on saccades - A modeling approach.
Behav Res Methods. 2017 Dec;49(6):2146-2162. doi: 10.3758/s13428-016-0848-4.
10
Noise-robust fixation detection in eye movement data: Identification by two-means clustering (I2MC).
Behav Res Methods. 2017 Oct;49(5):1802-1823. doi: 10.3758/s13428-016-0822-1.