

Gaze Error Estimation and Linear Transformation to Improve Accuracy of Video-Based Eye Trackers.

Author Information

Padikal Varun, Plonkowski Alex, Lawton Penelope F, Young Laura K, Read Jenny C A

Affiliations

Department of Biosciences, Newcastle University, Newcastle upon Tyne NE1 7RU, UK.

School of Medicine, Newcastle University, Newcastle upon Tyne NE1 7RU, UK.

Publication Information

Vision (Basel). 2025 Apr 3;9(2):29. doi: 10.3390/vision9020029.

DOI: 10.3390/vision9020029
PMID: 40265397
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC12015841/
Abstract

Eye tracking technology plays a crucial role in various fields such as psychology, medical training, marketing, and human-computer interaction. However, achieving high accuracy over a larger field of view in eye tracking systems remains a significant challenge, both in free viewing and in a head-stabilized condition. In this paper, we propose a simple approach to improve the accuracy of video-based eye trackers through the implementation of linear coordinate transformations. This method involves applying stretching, shearing, translation, or their combinations to correct gaze accuracy errors. Our investigation shows that re-calibrating the eye tracker via linear transformations significantly improves the accuracy of video-based tracker over a large field of view.
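The recalibration the abstract describes — correcting gaze accuracy errors by a combination of stretching, shearing, and translation — amounts to fitting a 2D affine map from measured gaze positions to known calibration targets. The sketch below is an illustrative least-squares formulation under that assumption; the function names and the solver choice are this sketch's own, not the authors' exact implementation:

```python
import numpy as np

def fit_affine_correction(measured, targets):
    """Fit a 2D affine map (linear part + translation) that best sends
    measured gaze points onto known calibration targets, in the
    least-squares sense.

    measured, targets: (N, 2) arrays of gaze/target coordinates
    (e.g., degrees of visual angle or pixels), N >= 3 non-collinear points.
    Returns A (2x2) and b (2,) such that corrected = measured @ A.T + b.
    """
    measured = np.asarray(measured, dtype=float)
    targets = np.asarray(targets, dtype=float)
    # Augment with a column of ones so the translation is fitted jointly
    # with the stretch/shear terms.
    X = np.hstack([measured, np.ones((len(measured), 1))])
    # Solve X @ M ~= targets for M (3x2); the top two rows hold the
    # linear part, the last row the translation.
    M, *_ = np.linalg.lstsq(X, targets, rcond=None)
    A = M[:2].T   # 2x2 linear part (captures stretching and shearing)
    b = M[2]      # translation
    return A, b

def apply_correction(gaze, A, b):
    """Apply the fitted transformation to raw gaze samples."""
    return np.asarray(gaze, dtype=float) @ A.T + b
```

In use, one would record gaze while the participant fixates a grid of targets spanning the field of view, fit `A` and `b` once, and then apply the correction to all subsequent gaze samples; because the model is linear, a handful of fixation points suffices and the fit is closed-form.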


Figures (vision-09-00029, g001–g011):

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/27b516de8b76/vision-09-00029-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/e3f57842d546/vision-09-00029-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/1d0855837d46/vision-09-00029-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/89fcb74f1af3/vision-09-00029-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/b138352e8007/vision-09-00029-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/2cac5306da6c/vision-09-00029-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/ee9068909bd2/vision-09-00029-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/76904d67024d/vision-09-00029-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/72726a749fc7/vision-09-00029-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/0c6e6342fa13/vision-09-00029-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c39c/12015841/c67eba046a00/vision-09-00029-g011.jpg

Similar Articles

1. Gaze Error Estimation and Linear Transformation to Improve Accuracy of Video-Based Eye Trackers.
Vision (Basel). 2025 Apr 3;9(2):29. doi: 10.3390/vision9020029.
2. From lab-based studies to eye-tracking in virtual and real worlds: conceptual and methodological problems and solutions. Symposium 4 at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 20.8.2019.
J Eye Mov Res. 2019 Nov 25;12(7). doi: 10.16910/jemr.12.7.8.
3. High-Accuracy 3D Gaze Estimation with Efficient Recalibration for Head-Mounted Gaze Tracking Systems.
Sensors (Basel). 2022 Jun 8;22(12):4357. doi: 10.3390/s22124357.
4. The impact of slippage on the data quality of head-worn eye trackers.
Behav Res Methods. 2020 Jun;52(3):1140-1160. doi: 10.3758/s13428-019-01307-0.
5. What to expect from your remote eye-tracker when participants are unrestrained.
Behav Res Methods. 2018 Feb;50(1):213-227. doi: 10.3758/s13428-017-0863-0.
6. Replacing eye trackers in ongoing studies: A comparison of eye-tracking data quality between the Tobii Pro TX300 and the Tobii Pro Spectrum.
Infancy. 2022 Jan;27(1):25-45. doi: 10.1111/infa.12441. Epub 2021 Oct 22.
7. Quantitative comparison of a mobile, tablet-based eye-tracker and two stationary, video-based eye-trackers.
Behav Res Methods. 2025 Jan 6;57(1):45. doi: 10.3758/s13428-024-02542-w.
8. Evaluating the Tobii Pro Glasses 2 and 3 in static and dynamic conditions.
Behav Res Methods. 2024 Aug;56(5):4221-4238. doi: 10.3758/s13428-023-02173-7. Epub 2023 Aug 7.
9. An investigation of the distribution of gaze estimation errors in head mounted gaze trackers using polynomial functions.
J Eye Mov Res. 2018 Jun 30;11(3). doi: 10.16910/jemr.11.3.5.
10. A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000.
PeerJ. 2019 Jul 9;7:e7086. doi: 10.7717/peerj.7086. eCollection 2019.

References Cited in This Article

1. RELAY: Robotic EyeLink AnalYsis of the EyeLink 1000 Using an Artificial Eye.
Vision (Basel). 2025 Mar 1;9(1):18. doi: 10.3390/vision9010018.
2. Eye tracker calibration: How well can humans refixate a target?
Behav Res Methods. 2024 Dec 19;57(1):23. doi: 10.3758/s13428-024-02564-4.
3. Visual Attention, Behavioral Intention, and Choice Behavior Among Older Consumers Toward Sports Marketing Images: An Eye-Tracking Study.
Front Psychol. 2022 May 19;13:855089. doi: 10.3389/fpsyg.2022.855089. eCollection 2022.
4. Algorithms for the automated correction of vertical drift in eye-tracking data.
Behav Res Methods. 2022 Feb;54(1):287-310. doi: 10.3758/s13428-021-01554-0. Epub 2021 Jun 22.
5. Eye tracking in human interaction: Possibilities and limitations.
Behav Res Methods. 2021 Aug;53(4):1592-1608. doi: 10.3758/s13428-020-01517-x. Epub 2021 Jan 6.
6. A "Forbidden Fruit Effect": An Eye-Tracking Study on Children's Visual Attention to Food Marketing.
Int J Environ Res Public Health. 2020 Mar 13;17(6):1859. doi: 10.3390/ijerph17061859.
7. Eye-Tracking Technology in Plastic and Reconstructive Surgery: A Systematic Review.
Aesthet Surg J. 2020 Aug 14;40(9):1022-1034. doi: 10.1093/asj/sjz328.
8. Eye-Tracking Technology in Surgical Training.
J Invest Surg. 2019 Nov;32(7):587-593. doi: 10.1080/08941939.2017.1404663. Epub 2017 Dec 18.
9. Eye-tracking technology in medical education: A systematic review.
Med Teach. 2018 Jan;40(1):62-69. doi: 10.1080/0142159X.2017.1391373. Epub 2017 Nov 26.
10. A simple algorithm for the offline recalibration of eye-tracking data through best-fitting linear transformation.
Behav Res Methods. 2015 Dec;47(4):1365-1376. doi: 10.3758/s13428-014-0544-1.