
Enhancing the usability of low-cost eye trackers for rehabilitation applications.

Affiliation

Embedded Systems & Robotics, TCS Research and Innovation, Tata Consultancy Services, Kolkata, India.

Publication

PLoS One. 2018 Jun 1;13(6):e0196348. doi: 10.1371/journal.pone.0196348. eCollection 2018.

DOI: 10.1371/journal.pone.0196348
PMID: 29856798
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5983534/
Abstract

Eye tracking is one of the most widely used techniques for assessment, screening, and human-machine interaction applications. Certain issues limit the use of eye trackers in practical scenarios, viz., i) the need to perform multiple calibrations and ii) the presence of inherent noise in the recorded data. To address the first issue, we have proposed a protocol for one-time calibration as an alternative to the "regular" or "multiple" calibration phases. Although multiple calibrations are always desirable, one-time calibration produces comparable results and may be better suited to individuals who are unable to perform multiple calibrations. In that case, the one-time calibration can be performed by a single participant and the calibration results reused for the remaining participants, provided the chin rest and eye tracker positions are unaltered. The second major issue is the inherent noise in the raw gaze data, which leads to systematic and variable errors; we have proposed a signal processing chain to remove these two types of errors. Two psychological stimulus-based tasks, namely a recall-recognition test and a number gazing task, are used as case studies. The proposed approach gives satisfactory results even with one-time calibration. The study is also extended to test the effect of long-duration tasks on the performance of the proposed algorithm, and the results confirm that the proposed methods work well in such scenarios too.
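The two-stage correction described in the abstract lends itself to a short sketch. The following is a minimal illustration, not the paper's actual implementation: it assumes the systematic error can be modeled as an affine distortion fitted once against known calibration targets (mirroring the one-time calibration idea), and the variable error as sample-to-sample jitter damped by a sliding median. All function names, the window size, and the toy data are hypothetical.

```python
import numpy as np

def fit_affine(measured, targets):
    """Least-squares affine map (2x2 matrix plus offset) taking measured
    gaze points to the known on-screen calibration target positions."""
    X = np.hstack([measured, np.ones((len(measured), 1))])  # (N, 3)
    P, *_ = np.linalg.lstsq(X, targets, rcond=None)         # (3, 2)
    return P

def correct_systematic(gaze, P):
    """Apply the fitted affine correction to raw gaze samples."""
    X = np.hstack([gaze, np.ones((len(gaze), 1))])
    return X @ P

def smooth_variable(gaze, window=5):
    """Sliding median over each coordinate to damp sample-to-sample jitter
    (variable error); the window size is an illustrative choice."""
    pad = window // 2
    padded = np.pad(gaze, ((pad, pad), (0, 0)), mode="edge")
    return np.stack([np.median(padded[i:i + window], axis=0)
                     for i in range(len(gaze))])

# Toy run: calibrate once on a 4-point grid, then reuse the correction
# for a later noisy session, as one-time calibration proposes.
targets = np.array([[0.1, 0.1], [0.9, 0.1], [0.1, 0.9], [0.9, 0.9]])
measured = targets + np.array([0.03, -0.02])      # simulated constant offset
P = fit_affine(measured, targets)
raw = measured + 0.01 * np.random.randn(4, 2)     # simulated jittery samples
clean = smooth_variable(correct_systematic(raw, P))
```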

[Figures 1–21 (pone.0196348.g001–g021) accompany the full text at https://pmc.ncbi.nlm.nih.gov/articles/PMC5983534/]

Similar articles

1. Enhancing the usability of low-cost eye trackers for rehabilitation applications. PLoS One. 2018 Jun 1;13(6):e0196348. doi: 10.1371/journal.pone.0196348. eCollection 2018.
2. Affordable sensor based gaze tracking for realistic psychological assessment. Annu Int Conf IEEE Eng Med Biol Soc. 2017 Jul;2017:746-750. doi: 10.1109/EMBC.2017.8036932.
3. Novel eye gaze tracking techniques under natural head movement. IEEE Trans Biomed Eng. 2007 Dec;54(12):2246-60. doi: 10.1109/tbme.2007.895750.
4. A nonvisual eye tracker calibration method for video-based tracking. J Vis. 2018 Sep 4;18(9):13. doi: 10.1167/18.9.13.
5. Implicit Calibration Using Probable Fixation Targets. Sensors (Basel). 2019 Jan 8;19(1):216. doi: 10.3390/s19010216.
6. Head movement compensation and multi-modal event detection in eye-tracking data for unconstrained head movements. J Neurosci Methods. 2016 Dec 1;274:13-26. doi: 10.1016/j.jneumeth.2016.09.005. Epub 2016 Sep 28.
7. What to expect from your remote eye-tracker when participants are unrestrained. Behav Res Methods. 2018 Feb;50(1):213-227. doi: 10.3758/s13428-017-0863-0.
8. Gaze gesture based human robot interaction for laparoscopic surgery. Med Image Anal. 2018 Feb;44:196-214. doi: 10.1016/j.media.2017.11.011. Epub 2017 Nov 28.
9. The impact of slippage on the data quality of head-worn eye trackers. Behav Res Methods. 2020 Jun;52(3):1140-1160. doi: 10.3758/s13428-019-01307-0.
10. [Standard technical specifications for methacholine chloride (Methacholine) bronchial challenge test (2023)]. Zhonghua Jie He He Hu Xi Za Zhi. 2024 Feb 12;47(2):101-119. doi: 10.3760/cma.j.cn112147-20231019-00247.
