

A novel gaze tracking method based on the generation of virtual calibration points.

Affiliation

Division of Electronics and Electrical Engineering, Dongguk University, Seoul 100-715, Korea.

Publication

Sensors (Basel). 2013 Aug 16;13(8):10802-22. doi: 10.3390/s130810802.

DOI: 10.3390/s130810802
PMID: 23959241
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC3812629/
Abstract

Most conventional gaze-tracking systems require that users look at many points during the initial calibration stage, which is inconvenient for them. To avoid this requirement, we propose a new gaze-tracking method with four important characteristics. First, our gaze-tracking system uses a large screen located at a distance from the user, who wears a lightweight device. Second, our system requires that users look at only four calibration points during the initial calibration stage, during which four pupil centers are noted. Third, five additional points (virtual pupil centers) are generated with a multilayer perceptron using the four actual points (detected pupil centers) as inputs. Fourth, when a user gazes at a large screen, the shape defined by the positions of the four pupil centers is a distorted quadrangle because of the nonlinear movement of the human eyeball. The gaze-detection accuracy is reduced if we map the pupil movement area onto the screen area using a single transform function. We overcame this problem by calculating the gaze position based on multi-geometric transforms using the five virtual points and the four actual points. Experiment results show that the accuracy of the proposed method is better than that of other methods.
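The paper's key idea is to replace the single pupil-to-screen transform with multiple geometric transforms over sub-regions delimited by the four detected pupil centers and five MLP-generated virtual points. As a minimal sketch of the single-homography baseline the abstract says is inaccurate, the snippet below fits a projective transform from the four calibration pupil centers to the four screen corners via the direct linear transform (DLT) and maps a run-time pupil center to a screen coordinate. All coordinates, and the function names, are illustrative assumptions, not the paper's implementation.

```python
# Baseline single-homography gaze mapping (illustrative, not the paper's
# multi-transform method). Fits H such that screen ~ H @ pupil from four
# calibration correspondences, then applies H to a new pupil center.
import numpy as np

def fit_homography(src, dst):
    """Solve for a 3x3 homography H (up to scale) with dst ~ H @ src, via DLT."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The null-space vector of A (last right-singular vector) holds H's entries.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)

def map_point(H, p):
    """Apply homography H to a 2-D point p (homogeneous divide)."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Hypothetical calibration data: pupil centers (camera pixels) recorded while
# the user gazes at the four corners of a 1920x1080 screen.
pupil_corners = [(210.0, 130.0), (390.0, 140.0), (380.0, 300.0), (200.0, 290.0)]
screen_corners = [(0.0, 0.0), (1920.0, 0.0), (1920.0, 1080.0), (0.0, 1080.0)]

H = fit_homography(pupil_corners, screen_corners)
gaze = map_point(H, (300.0, 215.0))  # a pupil center observed at run time
```

Because eyeball rotation is nonlinear, the four pupil centers form a distorted quadrangle, and one such homography misplaces gaze points away from the calibration corners; the paper's remedy is to fit separate transforms on the smaller quadrangles created by the five virtual points.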


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/009b4e9a940e/sensors-13-10802f1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/5bc455b07165/sensors-13-10802f2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/ae6c0cfaf4bf/sensors-13-10802f3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/8279d4725fb8/sensors-13-10802f4a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/88939cba49a0/sensors-13-10802f5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/f0aa557dd62b/sensors-13-10802f6.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/666174da2ca2/sensors-13-10802f7.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/6a6026c05f29/sensors-13-10802f8.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/5138ad01e641/sensors-13-10802f9.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/bdc32f9fa388/sensors-13-10802f10.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/911ea5e10479/sensors-13-10802f11.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/2cd96ad2cb94/sensors-13-10802f12.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/e95f063c3434/sensors-13-10802f13.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/43e424d1a3a4/sensors-13-10802f14.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/1523a934fb6c/sensors-13-10802f15.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8afb/3812629/3ecf64b015af/sensors-13-10802f16.jpg

Similar Articles

1. A novel gaze tracking method based on the generation of virtual calibration points.
Sensors (Basel). 2013 Aug 16;13(8):10802-22. doi: 10.3390/s130810802.
2. Estimation of Gaze Detection Accuracy Using the Calibration Information-Based Fuzzy System.
Sensors (Basel). 2016 Jan 5;16(1):60. doi: 10.3390/s16010060.
3. Long-Range Gaze Tracking System for Large Movements.
IEEE Trans Biomed Eng. 2013 Dec;60(12):3432-40. doi: 10.1109/TBME.2013.2266413. Epub 2013 Jun 6.
4. A free geometry model-independent neural eye-gaze tracking system.
J Neuroeng Rehabil. 2012 Nov 16;9:82. doi: 10.1186/1743-0003-9-82.
5. Hand-eye coordination-based implicit re-calibration method for gaze tracking on ultrasound machines: a statistical approach.
Int J Comput Assist Radiol Surg. 2020 May;15(5):837-845. doi: 10.1007/s11548-020-02143-w. Epub 2020 Apr 22.
6. Head-free, remote eye-gaze detection system based on pupil-corneal reflection method with easy calibration using two stereo-calibrated video cameras.
IEEE Trans Biomed Eng. 2013 Oct;60(10):2952-60. doi: 10.1109/TBME.2013.2266478. Epub 2013 Jun 6.
7. 3D Gaze Estimation Using RGB-IR Cameras.
Sensors (Basel). 2022 Dec 29;23(1):381. doi: 10.3390/s23010381.
8. Novel eye gaze tracking techniques under natural head movement.
IEEE Trans Biomed Eng. 2007 Dec;54(12):2246-60. doi: 10.1109/tbme.2007.895750.
9. Gaze estimation interpolation methods based on binocular data.
IEEE Trans Biomed Eng. 2012 Aug;59(8):2235-2243. doi: 10.1109/TBME.2012.2201716. Epub 2012 May 30.
10. Pupil size dynamics during fixation impact the accuracy and precision of video-based gaze estimation.
Vision Res. 2016 Jan;118:48-59. doi: 10.1016/j.visres.2014.12.018. Epub 2015 Jan 9.

Cited By

1. Multi-User Identification-Based Eye-Tracking Algorithm Using Position Estimation.
Sensors (Basel). 2016 Dec 27;17(1):41. doi: 10.3390/s17010041.
2. A new gaze estimation method considering external light.
Sensors (Basel). 2015 Mar 11;15(3):5935-81. doi: 10.3390/s150305935.
3. Nonwearable gaze tracking system for controlling home appliance.

References

1. Experimental investigations of pupil accommodation factors.
Invest Ophthalmol Vis Sci. 2011 Aug 17;52(9):6478-85. doi: 10.1167/iovs.10-6423.
2. An automatic personal calibration procedure for advanced gaze estimation systems.
IEEE Trans Biomed Eng. 2010 May;57(5):1031-9. doi: 10.1109/TBME.2009.2039351. Epub 2010 Feb 17.
3. Study on eye gaze estimation.
IEEE Trans Syst Man Cybern B Cybern. 2002;32(3):332-50. doi: 10.1109/TSMCB.2002.999809.
4. The effects of eye movements, age, and expertise on inattentional blindness.
Conscious Cogn. 2006 Sep;15(3):620-7. doi: 10.1016/j.concog.2006.01.001. Epub 2006 Feb 17.
5. A novel approach to 3-D gaze tracking using stereo cameras.
IEEE Trans Syst Man Cybern B Cybern. 2004 Feb;34(1):234-45. doi: 10.1109/tsmcb.2003.811128.