


A simple nonparametric method for classifying eye fixations.

Authors

Mould Matthew S, Foster David H, Amano Kinjiro, Oakley John P

Affiliation

School of Electrical and Electronic Engineering, University of Manchester, Manchester, UK.

Publication

Vision Res. 2012 Mar 15;57:18-25. doi: 10.1016/j.visres.2011.12.006. Epub 2012 Jan 2.

DOI: 10.1016/j.visres.2011.12.006
PMID: 22227608
Abstract

There is no standard method for classifying eye fixations. Thresholds for speed, acceleration, duration, and stability of point of gaze have each been employed to demarcate data, but they have no commonly accepted values. Here, some general distributional properties of eye movements were used to construct a simple method for classifying fixations, without parametric assumptions or expert judgment. The method was primarily speed-based, but the required optimum speed threshold was derived automatically from individual data for each observer and stimulus with the aid of Tibshirani, Walther, and Hastie's 'gap statistic'. An optimum duration threshold, also derived automatically from individual data, was used to eliminate the effects of instrumental noise. The method was tested on data recorded from a video eye-tracker sampling at 250 frames a second while experimental observers viewed static natural scenes in over 30,000 one-second trials. The resulting classifications were compared with those by three independent expert visual classifiers, with 88-94% agreement, and also against two existing parametric methods. Robustness to instrumental noise and sampling rate were verified in separate simulations. The method was applied to the recorded data to illustrate the variation of mean fixation duration and saccade amplitude across observers and scenes.
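The core idea of a speed-based classifier like the one described above can be sketched in a few lines: samples whose point-of-gaze speed falls below a threshold are candidate fixation samples, and candidate runs shorter than a duration threshold are discarded as instrumental noise. The sketch below is illustrative only, with both thresholds passed in as fixed parameters; in the paper they are derived automatically per observer and stimulus via the gap statistic. Coordinates are assumed to be in degrees of visual angle and time in seconds.

```python
import numpy as np

def classify_fixations(x, y, t, speed_threshold, min_duration):
    """Label each gaze sample True (fixation) or False (saccade/noise).

    Minimal speed-based sketch: a sample is "slow" if the inter-sample
    speed entering it is below `speed_threshold` (deg/s); runs of slow
    samples shorter than `min_duration` (s) are rejected as noise.
    """
    dt = np.diff(t)
    # point-of-gaze speed between consecutive samples, deg/s
    speed = np.hypot(np.diff(x), np.diff(y)) / dt
    # pad so the label array matches the number of samples
    slow = np.concatenate(([speed[0]], speed)) < speed_threshold

    fixation = np.zeros_like(slow)  # bool array, all False
    start = None
    for i, s in enumerate(slow):
        if s and start is None:
            start = i                      # a candidate run begins
        if (not s or i == len(slow) - 1) and start is not None:
            end = i + 1 if s else i        # run ended (or data ended)
            if t[end - 1] - t[start] >= min_duration:
                fixation[start:end] = True  # long enough: keep it
            start = None
    return fixation
```

For example, on synthetic 250 Hz data with two stationary periods separated by a fast jump, the two stationary runs come back as fixations and the jump samples do not. The per-observer threshold selection, which is the paper's actual contribution, is deliberately left out of this sketch.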


Similar Articles

1. A simple nonparametric method for classifying eye fixations.
   Vision Res. 2012 Mar 15;57:18-25. doi: 10.1016/j.visres.2011.12.006. Epub 2012 Jan 2.
2. Saccadic context indicates information processing within visual fixations: evidence from event-related potentials and eye-movements analysis of the distractor effect.
   Int J Psychophysiol. 2011 Apr;80(1):54-62. doi: 10.1016/j.ijpsycho.2011.01.013. Epub 2011 Feb 1.
3. Coarse-to-fine eye movement strategy in visual search.
   Vision Res. 2007 Aug;47(17):2272-80. doi: 10.1016/j.visres.2007.05.002. Epub 2007 Jul 6.
4. Evidence for two distinct mechanisms directing gaze in natural scenes.
   J Vis. 2012 Apr 1;12(4):9. doi: 10.1167/12.4.9.
5. Saliency does not account for fixations to eyes within social scenes.
   Vision Res. 2009 Dec;49(24):2992-3000. doi: 10.1016/j.visres.2009.09.014. Epub 2009 Sep 24.
6. The contribution of low-level features at the centre of gaze to saccade target selection.
   Vision Res. 2009 Dec;49(24):2918-26. doi: 10.1016/j.visres.2009.09.007. Epub 2009 Sep 16.
7. Quantifying center bias of observers in free viewing of dynamic natural scenes.
   J Vis. 2009 Jul 9;9(7):4. doi: 10.1167/9.7.4.
8. Shifts in the retinal image of a visual scene during saccades contribute to the perception of reached gaze direction in humans.
   Neurosci Lett. 2004 Feb 26;357(1):29-32. doi: 10.1016/j.neulet.2003.12.038.
9. Eye fixations of deaf and hearing observers in simultaneous communication perception.
   Ear Hear. 2006 Aug;27(4):331-52. doi: 10.1097/01.aud.0000226248.45263.ad.
10. Object-based attentional selection in scene viewing.
    J Vis. 2010 Jul 1;10(8):20. doi: 10.1167/10.8.20.

Cited By

1. Impact of web accessibility on cognitive engagement in individuals without disabilities: Evidence from a psychophysiological study.
   PLoS One. 2025 Jul 30;20(7):e0328552. doi: 10.1371/journal.pone.0328552. eCollection 2025.
2. The fundamentals of eye tracking part 4: Tools for conducting an eye tracking study.
   Behav Res Methods. 2025 Jan 6;57(1):46. doi: 10.3758/s13428-024-02529-7.
3. Visual search patterns during exploration of naturalistic scenes are driven by saliency cues in individuals with cerebral visual impairment.
   Sci Rep. 2024 Feb 6;14(1):3074. doi: 10.1038/s41598-024-53642-8.
4. Object identification in cerebral visual impairment characterized by gaze behavior and image saliency analysis.
   Brain Dev. 2023 Sep;45(8):432-444. doi: 10.1016/j.braindev.2023.05.001. Epub 2023 May 13.
5. Visualising Spatio-Temporal Gaze Characteristics for Exploratory Data Analysis in Clinical Fetal Ultrasound Scans.
   Proc Eye Track Res Appl Symp. 2022 Jun;2022. doi: 10.1145/3517031.3529635. Epub 2022 Jun 8.
6. Eye tracking: empirical foundations for a minimal reporting guideline.
   Behav Res Methods. 2023 Jan;55(1):364-416. doi: 10.3758/s13428-021-01762-8. Epub 2022 Apr 6.
7. Detection of normal and slow saccades using implicit piecewise polynomial approximation.
   J Vis. 2021 Jun 7;21(6):8. doi: 10.1167/jov.21.6.8.
8. Topology for gaze analyses - Raw data segmentation.
   J Eye Mov Res. 2017 Mar 13;10(1). doi: 10.16910/jemr.11.1.2.
9. Interpretable Machine Learning Models for Three-Way Classification of Cognitive Workload Levels for Eye-Tracking Features.
   Brain Sci. 2021 Feb 9;11(2):210. doi: 10.3390/brainsci11020210.
10. An Analysis of Entropy-Based Eye Movement Events Detection.
    Entropy (Basel). 2019 Jan 24;21(2):107. doi: 10.3390/e21020107.