
Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology.

Authors

Demšar Urška, Çöltekin Arzu

Affiliations

School of Geography & Sustainable Development, University of St Andrews, St Andrews, Scotland, United Kingdom.

Department of Geography, University of Zurich, Zurich, Switzerland.

Publication

PLoS One. 2017 Aug 4;12(8):e0181818. doi: 10.1371/journal.pone.0181818. eCollection 2017.

DOI: 10.1371/journal.pone.0181818
PMID: 28777822
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5544210/
Abstract

Eye movements provide insights into what people pay attention to, and therefore are commonly included in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze locations on the screen. Despite recent technological developments that have enabled more affordable hardware, gaze data are still costly and time-consuming to collect, so some propose using mouse movements instead, which are easy to collect automatically and on a large scale. If and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time density, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. The sensitivity to method parameters is evaluated on simulated trajectories where we can control interaction levels. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, in order to apply and test our new methodology on a real case. Further, as our experiment task mimics route-tracing when using a map, it is more than a data collection exercise and simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when people are instructed to move their eyes intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could potentially provide similar information as eye tracking and therefore be used as a proxy for attention. However, more research is needed to confirm this.
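To make the core idea concrete, here is a minimal, illustrative sketch (not the authors' exact formulation): each trajectory of (x, y, t) samples is rasterised into a 3-D space-time volume, and the interaction level between two trajectories (e.g. gaze and mouse) is scored as the overlap of their normalised volumes. The grid resolution and the histogram-intersection overlap measure are assumed choices for this demo.

```python
# Hedged sketch of a space-time-density interaction measure between two
# trajectories. Grid resolution and the overlap measure are illustrative
# assumptions, not the paper's exact method.

def space_time_density(traj, grid=8):
    """Rasterise (x, y, t) samples in [0, 1]^3 into a normalised volume.

    Returns a dict mapping (ix, iy, it) voxel indices to density mass.
    """
    vol = {}
    for x, y, t in traj:
        key = (min(int(x * grid), grid - 1),
               min(int(y * grid), grid - 1),
               min(int(t * grid), grid - 1))
        vol[key] = vol.get(key, 0.0) + 1.0
    total = sum(vol.values())
    return {k: v / total for k, v in vol.items()}

def interaction_level(traj_a, traj_b, grid=8):
    """Overlap of two space-time densities, in [0, 1].

    1.0 means the two movers occupy the same voxels at the same times;
    0.0 means no spatio-temporal co-occurrence at this resolution.
    """
    da = space_time_density(traj_a, grid)
    db = space_time_density(traj_b, grid)
    # Histogram intersection: sum of the per-voxel minima.
    return sum(min(da[k], db.get(k, 0.0)) for k in da)

if __name__ == "__main__":
    n = 50
    diagonal = [(i / n, i / n, i / n) for i in range(n)]      # "gaze"
    follower = [(i / n, i / n, i / n) for i in range(n)]      # mouse tracking the gaze
    wanderer = [(i / n, 1 - i / n, i / n) for i in range(n)]  # mouse moving elsewhere
    print(interaction_level(diagonal, follower))  # high: coupled movement
    print(interaction_level(diagonal, wanderer))  # low: decoupled movement
```

Because the density is indexed by time as well as space, two trajectories that visit the same screen regions at different moments score low, which is what distinguishes dynamic interaction from mere spatial overlap.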


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/f534346edd2b/pone.0181818.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/0fc526df1ae8/pone.0181818.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/d806d0289547/pone.0181818.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/1d70b7a91f97/pone.0181818.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/46ccbc77a7a9/pone.0181818.g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/1c7ab966a934/pone.0181818.g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/636caf49e5c4/pone.0181818.g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/4e70204038b4/pone.0181818.g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/a37f2d9e9ac6/pone.0181818.g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/4eca860e049c/pone.0181818.g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/a61364194e82/pone.0181818.g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/e8ba4d30052c/pone.0181818.g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/c0599206d4eb/pone.0181818.g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/e36d571c020f/pone.0181818.g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/4f2bc05be14c/pone.0181818.g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dd03/5544210/0c19b11df077/pone.0181818.g016.jpg

Similar articles

1. Quantifying gaze and mouse interactions on spatial visual interfaces with a new movement analytics methodology.
PLoS One. 2017 Aug 4;12(8):e0181818. doi: 10.1371/journal.pone.0181818. eCollection 2017.
2. Novel eye gaze tracking techniques under natural head movement.
IEEE Trans Biomed Eng. 2007 Dec;54(12):2246-60. doi: 10.1109/tbme.2007.895750.
3. Adaptive eye-gaze tracking using neural-network-based user profiles to assist people with motor disability.
J Rehabil Res Dev. 2008;45(6):801-17. doi: 10.1682/jrrd.2007.05.0075.
4. The Effectiveness of Gaze-Contingent Control in Computer Games.
Perception. 2015;44(8-9):1136-45. doi: 10.1177/0301006615594910. Epub 2015 Aug 14.
5. Spatial frequency processing in the central and peripheral visual field during scene viewing.
Vision Res. 2016 Oct;127:186-197. doi: 10.1016/j.visres.2016.05.008. Epub 2016 Sep 17.
6. A novel method for measuring gaze orientation in space in unrestrained head conditions.
J Vis. 2013 Jul 31;13(8):28. doi: 10.1167/13.8.28.
7. Measuring fixation disparity with infrared eye-trackers.
J Biomed Opt. 2009 Jan-Feb;14(1):014013. doi: 10.1117/1.3077198.
8. Implicit processing during change blindness revealed with mouse-contingent and gaze-contingent displays.
Atten Percept Psychophys. 2018 May;80(4):844-859. doi: 10.3758/s13414-017-1468-5.
9. Design and application of real-time visual attention model for the exploration of 3D virtual environments.
IEEE Trans Vis Comput Graph. 2012 Mar;18(3):356-68. doi: 10.1109/TVCG.2011.154.
10. Space-time visual analytics of eye-tracking data for dynamic stimuli.
IEEE Trans Vis Comput Graph. 2013 Dec;19(12):2129-38. doi: 10.1109/TVCG.2013.194.

Cited by

1. Attention and Information Acquisition: Comparison of Mouse-Click with Eye-Movement Attention Tracking.
J Eye Mov Res. 2018 Nov 16;11(6). doi: 10.16910/jemr.11.6.4.
2. Potential path volume (PPV): a geometric estimator for space use in 3D.
Mov Ecol. 2019 Apr 29;7:14. doi: 10.1186/s40462-019-0158-4. eCollection 2019.

References

1. Activity seascapes highlight central place foraging strategies in marine predators that never stop swimming.
Mov Ecol. 2018 Jun 21;6:9. doi: 10.1186/s40462-018-0127-3. eCollection 2018.
2. Voxelization algorithms for geospatial applications: Computational methods for voxelating spatial datasets of 3D city models containing 3D surface, curve and point data models.
MethodsX. 2016 Jan 13;3:69-86. doi: 10.1016/j.mex.2016.01.001. eCollection 2016.
3. Analysis and visualisation of movement: an interdisciplinary review.
Mov Ecol. 2015 Mar 10;3(1):5. doi: 10.1186/s40462-015-0032-y. eCollection 2015.
4. A critical examination of indices of dynamic interaction for wildlife telemetry studies.
J Anim Ecol. 2014 Sep;83(5):1216-33. doi: 10.1111/1365-2656.12198. Epub 2014 Feb 22.
5. Informing disease models with temporal and spatial contact structure among GPS-collared individuals in wild populations.
PLoS One. 2014 Jan 7;9(1):e84368. doi: 10.1371/journal.pone.0084368. eCollection 2014.
6. Interaction rules underlying group decisions in homing pigeons.
J R Soc Interface. 2013 Sep 25;10(89):20130529. doi: 10.1098/rsif.2013.0529. Print 2013 Dec 6.
7. Space-time visual analytics of eye-tracking data for dynamic stimuli.
IEEE Trans Vis Comput Graph. 2013 Dec;19(12):2129-38. doi: 10.1109/TVCG.2013.194.
8. Feasibility study on the spatial and temporal movement of Samburu's cattle and wildlife in Kenya using GPS radio-tracking, remote sensing and GIS.
Prev Vet Med. 2013 Aug 1;111(1-2):76-80. doi: 10.1016/j.prevetmed.2013.04.007. Epub 2013 May 25.
9. Freedom and rules in human sequential performance: a refractory period in eye-hand coordination.
J Vis. 2013 Mar 11;13(3):4. doi: 10.1167/13.3.4.
10. Yarbus, eye movements, and vision.
Iperception. 2010;1(1):7-27. doi: 10.1068/i0382. Epub 2010 Jul 12.