

A field test of computer-vision-based gaze estimation in psychology.

Affiliations

Experimental Psychology, Helmholtz Institute, Utrecht University, Heidelberglaan 1, 3584 CS, Utrecht, the Netherlands.

Lund University Humanities Lab, Lund University, Lund, Sweden.

Publication

Behav Res Methods. 2024 Mar;56(3):1900-1915. doi: 10.3758/s13428-023-02125-1. Epub 2023 Apr 26.

DOI: 10.3758/s13428-023-02125-1
PMID: 37101100
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10990994/
Abstract

Computer-vision-based gaze estimation refers to techniques that estimate gaze direction directly from video recordings of the eyes or face without the need for an eye tracker. Although many such methods exist, their validation is often found in the technical literature (e.g., computer science conference papers). We aimed to (1) identify which computer-vision-based gaze estimation methods are usable by the average researcher in fields such as psychology or education, and (2) evaluate these methods. We searched for methods that do not require calibration and have clear documentation. Two toolkits, OpenFace and OpenGaze, were found to fulfill these criteria. First, we present an experiment where adult participants fixated on nine stimulus points on a computer screen. We filmed their face with a camera and processed the recorded videos with OpenFace and OpenGaze. We conclude that OpenGaze is accurate and precise enough to be used in screen-based experiments with stimuli separated by at least 11 degrees of gaze angle. OpenFace was not sufficiently accurate for such situations but can potentially be used in sparser environments. We then examined whether OpenFace could be used with horizontally separated stimuli in a sparse environment with infant participants. We compared dwell measures based on OpenFace estimates to the same measures based on manual coding. We conclude that OpenFace gaze estimates may potentially be used with measures such as relative total dwell time to sparse, horizontally separated areas of interest, but should not be used to draw conclusions about measures such as dwell duration.
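The 11-degree threshold above can be related to physical distance on the screen with the standard visual-angle formula θ = 2·atan(s / 2d), where s is the on-screen separation and d the viewing distance. A minimal sketch (function names and the 60 cm viewing distance are illustrative assumptions, not values from the paper):

```python
import math

def visual_angle_deg(separation_cm: float, viewing_distance_cm: float) -> float:
    """Angle (degrees) subtended at the eye by two points on the screen."""
    return math.degrees(2 * math.atan(separation_cm / (2 * viewing_distance_cm)))

def separation_for_angle(angle_deg: float, viewing_distance_cm: float) -> float:
    """On-screen distance (cm) needed for a given gaze-angle separation."""
    return 2 * viewing_distance_cm * math.tan(math.radians(angle_deg / 2))

# At a 60 cm viewing distance, an 11-degree separation corresponds to
# roughly 11.6 cm between stimuli on the screen.
print(round(separation_for_angle(11.0, 60.0), 1))
```

In practice this means the authors' accuracy bound for OpenGaze only supports screen layouts whose areas of interest are spaced at least this far apart at a typical desktop viewing distance.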


Figures (PMC full text)

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a99d/10990994/f3808bd09ed1/13428_2023_2125_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a99d/10990994/9a2eaba68638/13428_2023_2125_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a99d/10990994/b9e3def0bcc4/13428_2023_2125_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a99d/10990994/de94dca2b6ee/13428_2023_2125_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a99d/10990994/5c578b1c30c6/13428_2023_2125_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a99d/10990994/953552bf5441/13428_2023_2125_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a99d/10990994/c52086639699/13428_2023_2125_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a99d/10990994/85dfb998fcf3/13428_2023_2125_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a99d/10990994/9a8fbf5952de/13428_2023_2125_Fig9_HTML.jpg

Similar Articles

1. A field test of computer-vision-based gaze estimation in psychology. Behav Res Methods. 2024 Mar;56(3):1900-1915. doi: 10.3758/s13428-023-02125-1. Epub 2023 Apr 26.
2. Eye-tracking glasses in face-to-face interactions: Manual versus automated assessment of areas-of-interest. Behav Res Methods. 2021 Oct;53(5):2037-2048. doi: 10.3758/s13428-021-01544-2. Epub 2021 Mar 19.
3. When I Look into Your Eyes: A Survey on Computer Vision Contributions for Human Gaze Estimation and Tracking. Sensors (Basel). 2020 Jul 3;20(13):3739. doi: 10.3390/s20133739.
4. Salient features in gaze-aligned recordings of human visual input during free exploration of natural environments. J Vis. 2008 Oct 23;8(14):12.1-17. doi: 10.1167/8.14.12.
5. A naturalistic viewing paradigm using 360° panoramic video clips and real-time field-of-view changes with eye-gaze tracking. Neuroimage. 2020 Aug 1;216:116617. doi: 10.1016/j.neuroimage.2020.116617. Epub 2020 Feb 10.
6. Gaze Focalization System for Driving Applications Using OpenFace 2.0 Toolkit with NARMAX Algorithm in Accidental Scenarios. Sensors (Basel). 2021 Sep 18;21(18):6262. doi: 10.3390/s21186262.
7. Remote Data Collection During a Pandemic: A New Approach for Assessing and Coding Multisensory Attention Skills in Infants and Young Children. Front Psychol. 2022 Jan 21;12:731618. doi: 10.3389/fpsyg.2021.731618. eCollection 2021.
8. Offline Calibration for Infant Gaze and Head Tracking across a Wide Horizontal Visual Field. Sensors (Basel). 2023 Jan 14;23(2):972. doi: 10.3390/s23020972.
9. A novel approach to 3-D gaze tracking using stereo cameras. IEEE Trans Syst Man Cybern B Cybern. 2004 Feb;34(1):234-45. doi: 10.1109/tsmcb.2003.811128.
10. A probabilistic approach to online eye gaze tracking without explicit personal calibration. IEEE Trans Image Process. 2015 Mar;24(3):1076-86. doi: 10.1109/TIP.2014.2383326.

Cited By

1. Automated Infant Eye Tracking: A Systematic Historical Review. Infancy. 2025 Jul-Aug;30(4):e70031. doi: 10.1111/infa.70031.
2. LEyes: A lightweight framework for deep learning-based eye tracking using synthetic eye images. Behav Res Methods. 2025 Mar 31;57(5):129. doi: 10.3758/s13428-025-02645-y.
3. GazeCapsNet: A Lightweight Gaze Estimation Framework.

本文引用的文献

1. Appearance-Based Gaze Estimation With Deep Learning: A Review and Benchmark. IEEE Trans Pattern Anal Mach Intell. 2024 Dec;46(12):7509-7528. doi: 10.1109/TPAMI.2024.3393571. Epub 2024 Nov 6.
2. Eye tracking: empirical foundations for a minimal reporting guideline. Behav Res Methods. 2023 Jan;55(1):364-416. doi: 10.3758/s13428-021-01762-8. Epub 2022 Apr 6.
3. Remote Data Collection During a Pandemic: A New Approach for Assessing and Coding Multisensory Attention Skills in Infants and Young Children. Front Psychol. 2022 Jan 21;12:731618. doi: 10.3389/fpsyg.2021.731618. eCollection 2021.
4. Sensors (Basel). 2025 Feb 17;25(4):1224. doi: 10.3390/s25041224.
5. The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study. Behav Res Methods. 2025 Jan 6;57(1):46. doi: 10.3758/s13428-024-02529-7.
6. Exploration of factors affecting webcam-based automated gaze coding. Behav Res Methods. 2024 Oct;56(7):7374-7390. doi: 10.3758/s13428-024-02424-1. Epub 2024 May 1.
7. Model-Based 3D Gaze Estimation Using a TOF Camera. Sensors (Basel). 2024 Feb 6;24(4):1070. doi: 10.3390/s24041070.
8. Replacing eye trackers in ongoing studies: A comparison of eye-tracking data quality between the Tobii Pro TX300 and the Tobii Pro Spectrum. Infancy. 2022 Jan;27(1):25-45. doi: 10.1111/infa.12441. Epub 2021 Oct 22.
9. The pupil-size artefact (PSA) across time, viewing direction, and different eye trackers. Behav Res Methods. 2021 Oct;53(5):1986-2006. doi: 10.3758/s13428-020-01512-2. Epub 2021 Mar 11.
10. Advances in Relating Eye Movements and Cognition. Infancy. 2004 Sep;6(2):267-274. doi: 10.1207/s15327078in0602_7. Epub 2004 Sep 1.
11. Accelerating eye movement research via accurate and affordable smartphone eye tracking. Nat Commun. 2020 Sep 11;11(1):4553. doi: 10.1038/s41467-020-18360-5.
12. A Critical Test of Temporal and Spatial Accuracy of the Tobii T60XL Eye Tracker. Infancy. 2012 Jan;17(1):9-32. doi: 10.1111/j.1532-7078.2011.00089.x. Epub 2011 Aug 29.
13. Advances in Eye Tracking in Infancy Research. Infancy. 2012 Jan;17(1):1-8. doi: 10.1111/j.1532-7078.2011.00101.x. Epub 2011 Nov 1.
14. Characterizing gaze position signals and synthesizing noise during fixations in eye-tracking data. Behav Res Methods. 2020 Dec;52(6):2515-2534. doi: 10.3758/s13428-020-01400-9.