Fuzzy System-Based Target Selection for a NIR Camera-Based Gaze Tracker.

Authors

Naqvi Rizwan Ali, Arsalan Muhammad, Park Kang Ryoung

Affiliations

Division of Electronics and Electrical Engineering, Dongguk University, 30 Pildong-ro 1-gil, Jung-gu, Seoul 100-715, Korea.

Publication Information

Sensors (Basel). 2017 Apr 14;17(4):862. doi: 10.3390/s17040862.

DOI: 10.3390/s17040862
PMID: 28420114
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5424739/
Abstract

Gaze-based interaction (GBI) techniques have been a popular subject of research in the last few decades. Among other applications, GBI can be used by persons with disabilities to perform everyday tasks, can serve as a game interface, and can play a pivotal role in the human-computer interface (HCI) field. While gaze tracking systems have shown high accuracy in GBI, detecting a user's gaze for target selection is a challenging problem that must be considered when using a gaze detection system. Past research has used eye blinking for this purpose, as well as dwell-time-based methods, but these techniques are either inconvenient for the user or require a long time for target selection. Therefore, in this paper, we propose a method for fuzzy system-based target selection for near-infrared (NIR) camera-based gaze trackers. The results of the experiments, together with usability tests and on-screen keyboard trials of the proposed method, show that it outperforms previous methods.
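As a rough illustration of how a fuzzy inference system can turn raw gaze measurements into a selection decision, the minimal Python sketch below scores a candidate target from two commonly used gaze features, fixation dwell time and gaze-point dispersion. The features, membership-function ranges, rule base, and singleton defuzzification here are illustrative assumptions, not the rule base used in the paper:

# Minimal Mamdani-style fuzzy sketch of gaze-based target selection.
# Illustrative only: the input features (dwell time, gaze dispersion),
# membership functions, rules, and ranges are assumptions, not the
# fuzzy system described in the paper.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def selection_score(dwell_ms, dispersion_px):
    """Fuzzy confidence (0..1) that the user intends to select the gazed target."""
    # Fuzzify the inputs (ranges are illustrative assumptions).
    dwell_long  = tri(dwell_ms, 300, 800, 1300)
    dwell_short = tri(dwell_ms, -1, 0, 500)
    disp_small  = tri(dispersion_px, -1, 0, 40)
    disp_large  = tri(dispersion_px, 20, 80, 140)

    # Rule base (min = AND, max = OR):
    # R1: long dwell AND small dispersion -> select
    # R2: short dwell OR large dispersion -> reject
    w_select = min(dwell_long, disp_small)
    w_reject = max(dwell_short, disp_large)

    # Defuzzify as a weighted average of singleton outputs (select = 1, reject = 0).
    if w_select + w_reject == 0:
        return 0.0
    return w_select / (w_select + w_reject)

if __name__ == "__main__":
    print(selection_score(900, 15))   # steady 900 ms fixation, 15 px scatter -> ~1.0, select
    print(selection_score(200, 90))   # brief, scattered glance -> 0.0, ignore

The min/max rule evaluation with two-singleton weighted-average defuzzification keeps the sketch self-contained; a practical system would tune the membership ranges per user after gaze calibration and fire a selection only when the score exceeds a threshold.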


Figures 1–20 (sensors-17-00862-g001 through g020) accompany the full text at https://pmc.ncbi.nlm.nih.gov/articles/PMC5424739/

Similar Articles

1. Fuzzy System-Based Target Selection for a NIR Camera-Based Gaze Tracker. Sensors (Basel). 2017 Apr 14;17(4):862. doi: 10.3390/s17040862.
2. Estimation of Gaze Detection Accuracy Using the Calibration Information-Based Fuzzy System. Sensors (Basel). 2016 Jan 5;16(1):60. doi: 10.3390/s16010060.
3. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement. Sensors (Basel). 2016 Aug 31;16(9):1396. doi: 10.3390/s16091396.
4. Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor. Sensors (Basel). 2018 Feb 3;18(2):456. doi: 10.3390/s18020456.
5. A Gaze-Based Virtual Keyboard Using a Mouth Switch for Command Selection. Annu Int Conf IEEE Eng Med Biol Soc. 2018 Jul;2018:3334-3337. doi: 10.1109/EMBC.2018.8512929.
6. Behavioral Activity Recognition Based on Gaze Ethograms. Int J Neural Syst. 2020 Jul;30(7):2050025. doi: 10.1142/S0129065720500252. Epub 2020 Jun 9.
7. Etracker: A Mobile Gaze-Tracking System with Near-Eye Display Based on a Combined Gaze-Tracking Algorithm. Sensors (Basel). 2018 May 19;18(5):1626. doi: 10.3390/s18051626.
8. Design and application of real-time visual attention model for the exploration of 3D virtual environments. IEEE Trans Vis Comput Graph. 2012 Mar;18(3):356-68. doi: 10.1109/TVCG.2011.154.
9. A neural-based remote eye gaze tracker under natural head motion. Comput Methods Programs Biomed. 2008 Oct;92(1):66-78. doi: 10.1016/j.cmpb.2008.06.008. Epub 2008 Jul 30.
10. A Functional Usability Analysis of Appearance-Based Gaze Tracking for Accessibility. Proc Eye Track Res Appl Symp. 2024 Jun;2024. doi: 10.1145/3649902.3656363. Epub 2024 Jun 4.

Cited By

1. Review and Evaluation of Eye Movement Event Detection Algorithms. Sensors (Basel). 2022 Nov 15;22(22):8810. doi: 10.3390/s22228810.
2. Significant Measures of Gaze and Pupil Movement for Evaluating Empathy between Viewers and Digital Content. Sensors (Basel). 2022 Feb 22;22(5):1700. doi: 10.3390/s22051700.
3. On the Improvement of Eye Tracking-Based Cognitive Workload Estimation Using Aggregation Functions.

References

1. Predicting the eye fixation locations in the gray scale images in the visual scenes with different semantic contents. Cogn Neurodyn. 2016 Feb;10(1):31-47. doi: 10.1007/s11571-015-9357-x. Epub 2015 Oct 7.
2. Learning a Combined Model of Visual Saliency for Fixation Prediction. IEEE Trans Image Process. 2016 Apr;25(4):1566-79. doi: 10.1109/TIP.2016.2522380. Epub 2016 Jan 27.
3. Salient Object Detection: A Benchmark. IEEE Trans Image Process. 2015 Dec;24(12):5706-22. doi: 10.1109/TIP.2015.2487833. Epub 2015 Oct 7.
4. Saccadic model of eye movements for free-viewing condition. Vision Res. 2015 Nov;116(Pt B):152-64. doi: 10.1016/j.visres.2014.12.026. Epub 2015 Feb 24.
5. Nonwearable gaze tracking system for controlling home appliance. ScientificWorldJournal. 2014;2014:303670. doi: 10.1155/2014/303670. Epub 2014 Sep 14.
6. Bayesian saliency via low and mid level cues. IEEE Trans Image Process. 2013 May;22(5):1689-98. doi: 10.1109/TIP.2012.2216276. Epub 2012 Aug 30.
7. Quantitative analysis of human-model agreement in visual saliency modeling: a comparative study. IEEE Trans Image Process. 2013 Jan;22(1):55-69. doi: 10.1109/TIP.2012.2210727. Epub 2012 Jul 30.
8. State-of-the-art in visual attention modeling. IEEE Trans Pattern Anal Mach Intell. 2013 Jan;35(1):185-207. doi: 10.1109/TPAMI.2012.89.
9. Alternative communication systems for people with severe motor disabilities: a survey. Biomed Eng Online. 2011 Apr 20;10:31. doi: 10.1186/1475-925X-10-31.
10. An Image Statistics-Based Model for Fixation Prediction. Cognit Comput. 2011 Mar;3(1):94-104. doi: 10.1007/s12559-010-9087-7. Epub 2010 Dec 14.