Niehorster Diederick C, Hessels Roy S, Nyström Marcus, Benjamins Jeroen S, Hooge Ignace T C
Lund University Humanities Lab, Lund University, Lund, Sweden.
Department of Psychology, Lund University, Lund, Sweden.
Behav Res Methods. 2025 Jun 3;57(7):188. doi: 10.3758/s13428-025-02704-4.
Wearable eye trackers deliver eye-tracking data relative to a scene video that is acquired by a camera affixed to the participant's head. Analyzing and interpreting such head-centered data is difficult and laborious manual work. Automated methods to map eye-tracking data to a world-centered reference frame (e.g., screens and tabletops) are available. These methods usually make use of fiducial markers. However, such mapping methods may be difficult to implement, expensive, and eye tracker-specific.
Here we present gazeMapper, an open-source tool for automated mapping and processing of eye-tracking data. gazeMapper can: (1) transform head-centered data to planes in the world, (2) synchronize recordings from multiple participants, and (3) determine data quality measures, e.g., accuracy and precision. gazeMapper comes with a GUI application (Windows, macOS, and Linux) and supports 11 different wearable eye trackers from AdHawk, Meta, Pupil, SeeTrue, SMI, Tobii, and Viewpointsystem. It is also possible to sidestep the GUI and use gazeMapper as a Python library directly.