
GazeParser: an open-source and multiplatform library for low-cost eye tracking and analysis.

Affiliation

Ehime University, 3 Bunkyo-cho, Matsuyama, Ehime, 790-8577, Japan.

Publication

Behav Res Methods. 2013 Sep;45(3):684-95. doi: 10.3758/s13428-012-0286-x.

Abstract

Eye movement analysis is an effective method for research on visual perception and cognition. However, recording eye movements presents practical difficulties related to the cost of recording devices and the programming of device control for use in experiments. GazeParser is an open-source library for low-cost eye tracking and data analysis; it consists of a video-based eyetracker and libraries for data recording and analysis. The libraries are written in Python and can be used in conjunction with the PsychoPy and VisionEgg experimental control libraries. Three eye movement experiments that tested the performance of GazeParser are reported. These showed that the means and standard deviations of the errors in sampling intervals were less than 1 ms. Spatial accuracy ranged from 0.7° to 1.2°, depending on the participant. In gap/overlap and antisaccade tasks, the latencies and amplitudes of the saccades detected by GazeParser agreed with those detected by a commercial eyetracker. These results show that GazeParser performs adequately for use in psychological experiments.
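As a concrete illustration of how the recording library is driven from an experiment script, the sketch below outlines a PsychoPy-based session: obtain a tracker controller, connect to the recorder, calibrate, then start and stop recording and time-stamp events on each trial. The controller calls (getController, connect, openDataFile, calibrationLoop, startRecording, sendMessage, stopRecording) follow GazeParser's documented usage pattern, but the exact names, arguments, recorder IP address, and calibration geometry shown here are assumptions for illustration, not code taken from the paper.

```python
# Sketch of a GazeParser recording session controlled from PsychoPy.
# NOTE: the controller method names and signatures below follow GazeParser's
# documented usage pattern but are assumptions -- check them against the
# documentation of your installed version before running.

from psychopy import visual, core
import GazeParser.TrackingTools

win = visual.Window(size=(1024, 768), units='pix')
stim = visual.Circle(win, radius=5, fillColor='white')

# Controller that talks to the GazeParser recorder over TCP/IP.
tracker = GazeParser.TrackingTools.getController(backend='PsychoPy')
tracker.connect('192.168.1.1')              # recorder PC address (placeholder)
tracker.openDataFile('participant01.csv')   # data file created on the recorder PC

# Calibration: define a 3 x 3 target grid, then run the built-in routine.
tracker.setCalibrationScreen(win)
area = [-400, -300, 400, 300]
calpos = [(x, y) for x in (-350, 0, 350) for y in (-250, 0, 250)]
tracker.setCalibrationTargetPositions(area, calpos)
tracker.calibrationLoop()

for trial in range(10):
    tracker.startRecording(message='trial %d start' % trial)

    stim.draw()
    win.flip()
    tracker.sendMessage('stim on')          # time-stamps an event in the data file
    core.wait(1.0)

    tracker.stopRecording(message='trial %d end' % trial)

tracker.closeDataFile()
win.close()
core.quit()
```

The recorded data would then be loaded with GazeParser's analysis library for saccade and fixation detection; the message strings sent during recording serve as the time anchors for computing latencies such as those reported in the gap/overlap and antisaccade experiments.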


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5226/3745611/b475d366e701/13428_2012_286_Fig1_HTML.jpg
