Demšar Urška, Çöltekin Arzu
School of Geography & Sustainable Development, University of St Andrews, St Andrews, Scotland, United Kingdom.
Department of Geography, University of Zurich, Zurich, Switzerland.
PLoS One. 2017 Aug 4;12(8):e0181818. doi: 10.1371/journal.pone.0181818. eCollection 2017.
Eye movements provide insights into what people pay attention to and are therefore commonly recorded in a variety of human-computer interaction studies. Eye movement recording devices (eye trackers) produce gaze trajectories, that is, sequences of gaze locations on the screen. Despite recent technological developments that have made hardware more affordable, gaze data remain costly and time-consuming to collect, which is why some researchers propose using mouse movements instead, as these are easy to collect automatically and on a large scale. Whether and how these two movement types are linked, however, is less clear and highly debated. We address this problem in two ways. First, we introduce a new movement analytics methodology to quantify the level of dynamic interaction between the gaze and the mouse pointer on the screen. Our method uses a volumetric representation of movement, the space-time density, which allows us to calculate interaction levels between two physically different types of movement. We describe the method and compare the results with existing dynamic interaction methods from movement ecology. Sensitivity to method parameters is evaluated on simulated trajectories where we can control the level of interaction. Second, we perform an experiment with eye and mouse tracking to generate real data with real levels of interaction, on which we apply and test our new methodology. Further, because our experimental task mimics route tracing on a map, it is more than a data collection exercise: it simultaneously allows us to investigate the actual connection between the eye and the mouse. We find that there seems to be a natural coupling when the eyes are not under conscious control, but that this coupling breaks down when participants are instructed to move their eyes intentionally. Based on these observations, we tentatively suggest that for natural tracing tasks, mouse tracking could provide similar information to eye tracking and could therefore be used as a proxy for attention. However, more research is needed to confirm this.
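The core idea of the methodology can be illustrated with a minimal sketch: each trajectory (gaze or mouse) is a sequence of (x, y, t) samples, which is smoothed into a volumetric space-time density, and the interaction level is then computed as the overlap between two such densities. This is only an illustrative toy version, not the authors' actual implementation; the grid resolution, the Gaussian kernel, its bandwidth `sigma`, and the minimum-overlap interaction score are all simplifying assumptions made here.

```python
import numpy as np

def space_time_density(traj, grid_shape=(32, 32, 32), sigma=2.0):
    """Toy space-time density for a trajectory of (x, y, t) points in [0, 1]^3.

    Each sample contributes an isotropic Gaussian kernel; the volume is
    normalized to sum to 1 so two densities are directly comparable.
    """
    gx = np.linspace(0, 1, grid_shape[0])
    gy = np.linspace(0, 1, grid_shape[1])
    gt = np.linspace(0, 1, grid_shape[2])
    X, Y, T = np.meshgrid(gx, gy, gt, indexing="ij")
    density = np.zeros(grid_shape)
    h = sigma / grid_shape[0]  # kernel bandwidth in normalized units (assumed)
    for x, y, t in traj:
        density += np.exp(-((X - x) ** 2 + (Y - y) ** 2 + (T - t) ** 2) / (2 * h ** 2))
    return density / density.sum()

def interaction_level(d1, d2):
    """Overlap of two normalized densities: 1 = identical, 0 = fully disjoint."""
    return np.minimum(d1, d2).sum()
```

Under this sketch, two trajectories that follow each other closely in both space and time yield densities that occupy the same voxels and score near 1, while trajectories that visit the same locations at different times (or different locations at the same time) score near 0 — which is what distinguishes dynamic interaction from mere spatial overlap.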