


A model-based method with geometric solutions for gaze correction in eye-tracking.

Author Information

Zheng Xiu Juan, Li Zhan Heng, Chun Xin Yi, Yang Xiao Mei, Liu Kai

Affiliations

College of Electrical Engineering, Sichuan University, Chengdu 610065, China.

Publication Information

Math Biosci Eng. 2019 Nov 26;17(2):1396-1412. doi: 10.3934/mbe.2020071.

DOI: 10.3934/mbe.2020071
PMID: 32233585
Abstract

The eyeball distortions caused by eye diseases, such as myopia and strabismus, can lead to deviations in eye-tracking data. In this paper, a model-based method with geometric solutions is proposed for gaze correction. The deviations of estimated gaze points are analyzed geometrically using individual eyeball models that account for the distortions caused by myopia and strabismus. A set of integrated geometric solutions is derived for the various situations, including the case of strabismus alone and the case of combined myopia and strabismus, and is then used for gaze correction in eye-tracking. The experimental results demonstrate that this model-based method is effective in reducing deviations in estimated gaze points and can be used to correct the modeling error in eye-tracking. Moreover, the proposed method has the potential to provide a simple approach to correcting eye-tracking data for various populations with eye diseases.
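
The abstract describes the correction only at a high level. As a rough Python sketch of the general idea (not the paper's actual derivation), the snippet below models the per-subject deviation as a fixed two-angle offset of the gaze ray: the estimated ray from the eyeball center is rotated by that offset and re-intersected with the screen plane. The eyeball model, the two-angle parameterization, and all names (rotation_from_angles, correct_gaze_point) are illustrative assumptions; the paper instead derives case-specific geometric solutions from individual eyeball models.

    import numpy as np

    def rotation_from_angles(h_deg, v_deg):
        """Build a rotation that tilts a roughly forward (+z) gaze ray
        by a horizontal and a vertical offset angle, in degrees."""
        h, v = np.radians(h_deg), np.radians(v_deg)
        ry = np.array([[np.cos(h), 0.0, np.sin(h)],      # yaw about the y-axis
                       [0.0, 1.0, 0.0],
                       [-np.sin(h), 0.0, np.cos(h)]])
        rx = np.array([[1.0, 0.0, 0.0],                  # pitch about the x-axis
                       [0.0, np.cos(v), -np.sin(v)],
                       [0.0, np.sin(v), np.cos(v)]])
        return rx @ ry

    def correct_gaze_point(eye_center, est_point, screen_z,
                           offset_h_deg, offset_v_deg):
        """Rotate the estimated gaze ray by a per-subject angular offset
        (a stand-in for the strabismus/myopia deviation terms) and
        re-intersect it with the screen plane z = screen_z."""
        ray = est_point - eye_center
        ray = ray / np.linalg.norm(ray)
        ray = rotation_from_angles(offset_h_deg, offset_v_deg) @ ray
        t = (screen_z - eye_center[2]) / ray[2]          # ray-plane intersection
        return eye_center + t * ray

    # Hypothetical usage: units in mm, screen 600 mm from the eye. The
    # offsets would be fitted from a few known calibration targets.
    eye = np.array([0.0, 0.0, 0.0])
    estimated = np.array([55.0, 10.0, 600.0])
    corrected = correct_gaze_point(eye, estimated, 600.0, -2.0, 0.5)

In a real calibration the two offset angles would be estimated per subject by minimizing the error between corrected gaze points and known on-screen targets, which is what motivates the closed-form geometric solutions the paper proposes.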


Similar Articles

1. A model-based method with geometric solutions for gaze correction in eye-tracking. Math Biosci Eng. 2019 Nov 26;17(2):1396-1412. doi: 10.3934/mbe.2020071.
2. High-Accuracy 3D Gaze Estimation with Efficient Recalibration for Head-Mounted Gaze Tracking Systems. Sensors (Basel). 2022 Jun 8;22(12):4357. doi: 10.3390/s22124357.
3. Stable Gaze Tracking with Filtering Based on Internet of Things. Sensors (Basel). 2022 Apr 20;22(9):3131. doi: 10.3390/s22093131.
4. Dual-Cameras-Based Driver's Eye Gaze Tracking System with Non-Linear Gaze Point Refinement. Sensors (Basel). 2022 Mar 17;22(6):2326. doi: 10.3390/s22062326.
5. Gaze Tracking and Point Estimation Using Low-Cost Head-Mounted Devices. Sensors (Basel). 2020 Mar 30;20(7):1917. doi: 10.3390/s20071917.
6. MRGazer: decoding eye gaze points from functional magnetic resonance imaging in individual space. J Neural Eng. 2024 Aug 13;21(4). doi: 10.1088/1741-2552/ad6185.
7. Novel eye gaze tracking techniques under natural head movement. IEEE Trans Biomed Eng. 2007 Dec;54(12):2246-60. doi: 10.1109/tbme.2007.895750.
8. When I Look into Your Eyes: A Survey on Computer Vision Contributions for Human Gaze Estimation and Tracking. Sensors (Basel). 2020 Jul 3;20(13):3739. doi: 10.3390/s20133739.
9. Characterizing gaze position signals and synthesizing noise during fixations in eye-tracking data. Behav Res Methods. 2020 Dec;52(6):2515-2534. doi: 10.3758/s13428-020-01400-9.
10. Cross-talk elimination for lenslet array near eye display based on eye-gaze tracking. Opt Express. 2022 May 9;30(10):16196-16216. doi: 10.1364/OE.455482.

Cited By

1. Effects of tracker location on the accuracy and precision of the Gazepoint GP3 HD for spectacle wearers. Behav Res Methods. 2024 Jan;56(1):43-52. doi: 10.3758/s13428-022-02023-y. Epub 2022 Nov 22.