


A Wide-View Parallax-Free Eye-Mark Recorder with a Hyperboloidal Half-Silvered Mirror and Appearance-Based Gaze Estimation.

Publication Info

IEEE Trans Vis Comput Graph. 2011 Jul;17(7):900-12. doi: 10.1109/TVCG.2010.113. Epub 2010 Aug 26.

DOI: 10.1109/TVCG.2010.113
PMID: 20733233
Abstract

In this paper, we propose a wide-view parallax-free eye-mark recorder with a hyperboloidal half-silvered mirror and a gaze estimation method suitable for the device. Our eye-mark recorder provides a wide field-of-view video recording of the user's exact view by positioning the focal point of the mirror at the user's viewpoint. The vertical angle of view of the prototype is 122 degrees (elevation and depression angles are 38 and 84 degrees, respectively) and its horizontal view angle is 116 degrees (nasal and temporal view angles are 38 and 78 degrees, respectively). We implemented and evaluated a gaze estimation method for our eye-mark recorder. We use an appearance-based approach for our eye-mark recorder to support a wide field-of-view. We apply principal component analysis (PCA) and multiple regression analysis (MRA) to determine the relationship between the captured images and their corresponding gaze points. Experimental results verify that our eye-mark recorder successfully captures a wide field-of-view of a user and estimates gaze direction with an angular accuracy of around 2 to 4 degrees.
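The PCA-plus-multiple-regression pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the image size, sample count, number of components, and the random placeholder data are all invented assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical calibration data: N eye-region images flattened to vectors,
# each paired with a known 2-D gaze point (e.g. azimuth/elevation in degrees).
# Real data would come from the recorder's eye camera during calibration.
N, D = 200, 32 * 32              # 200 samples of 32x32 grayscale eye images
images = rng.random((N, D))
gaze = rng.random((N, 2))

# --- PCA: project images onto their top-k principal components ---
mean = images.mean(axis=0)
X = images - mean                # center the data
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 20                           # assumed number of components
components = Vt[:k]              # (k, D) principal axes
scores = X @ components.T        # (N, k) low-dimensional appearance features

# --- Multiple regression: linear map from PCA scores to gaze points ---
A = np.hstack([scores, np.ones((N, 1))])       # add intercept column
W, *_ = np.linalg.lstsq(A, gaze, rcond=None)   # (k+1, 2) coefficients

def estimate_gaze(image):
    """Estimate a 2-D gaze point from a new flattened eye image."""
    s = (image - mean) @ components.T
    return np.hstack([s, 1.0]) @ W

pred = estimate_gaze(images[0])  # shape (2,): estimated gaze coordinates
```

The design mirrors the abstract's two stages: PCA compresses high-dimensional eye appearance into a few scores, and multiple regression fits a linear relation between those scores and the calibrated gaze points.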


Similar Articles

1. A Wide-View Parallax-Free Eye-Mark Recorder with a Hyperboloidal Half-Silvered Mirror and Appearance-Based Gaze Estimation.
   IEEE Trans Vis Comput Graph. 2011 Jul;17(7):900-12. doi: 10.1109/TVCG.2010.113. Epub 2010 Aug 26.
2. Appearance-based gaze estimation using visual saliency.
   IEEE Trans Pattern Anal Mach Intell. 2013 Feb;35(2):329-41. doi: 10.1109/TPAMI.2012.101.
3. Mobile three dimensional gaze tracking.
   Stud Health Technol Inform. 2011;163:616-22.
4. Measuring and tracking eye movements of a behaving archer fish by real-time stereo vision.
   J Neurosci Methods. 2009 Nov 15;184(2):235-43. doi: 10.1016/j.jneumeth.2009.08.006. Epub 2009 Aug 19.
5. Gaze Estimation From Eye Appearance: A Head Pose-Free Method via Eye Image Synthesis.
   IEEE Trans Image Process. 2015 Nov;24(11):3680-93. doi: 10.1109/TIP.2015.2445295. Epub 2015 Jun 12.
6. Quantitative measurement of eyestrain on 3D stereoscopic display considering the eye foveation model and edge information.
   Sensors (Basel). 2014 May 15;14(5):8577-604. doi: 10.3390/s140508577.
7. Recording gaze trajectory of wheelchair users by a spherical camera.
   IEEE Int Conf Rehabil Robot. 2017 Jul;2017:929-934. doi: 10.1109/ICORR.2017.8009368.
8. TORNADO: omnistereo video imaging with rotating optics.
   IEEE Trans Vis Comput Graph. 2005 Nov-Dec;11(6):614-25. doi: 10.1109/TVCG.2005.107.
9. Estimation of Gaze Detection Accuracy Using the Calibration Information-Based Fuzzy System.
   Sensors (Basel). 2016 Jan 5;16(1):60. doi: 10.3390/s16010060.
10. Long-Range Gaze Tracking System for Large Movements.
    IEEE Trans Biomed Eng. 2013 Dec;60(12):3432-40. doi: 10.1109/TBME.2013.2266413. Epub 2013 Jun 6.