Wolfson Centre for Mathematical Biology, Mathematical Institute, University of Oxford, Radcliffe Observatory Quarter, OX2 6GG, Oxford, UK.
Department of Integrated Mathematical Oncology, Moffitt Cancer Center, Magnolia Drive, 12902, Tampa, USA.
Biomed Eng Online. 2019 May 3;18(1):51. doi: 10.1186/s12938-019-0670-1.
Avoidance of looking others in the eye is a characteristic symptom of Autism Spectrum Disorders (ASD), and it has been hypothesised that quantitative monitoring of gaze patterns could be useful for objectively evaluating treatments. However, tools to measure gaze behaviour on a regular basis at a manageable cost are lacking. In this paper, we investigated whether a smartphone-based tool could address this problem. Specifically, we assessed how accurately the phone-based, state-of-the-art eye-tracking algorithm iTracker can distinguish between gaze towards the eyes and gaze towards the mouth of a face displayed on the smartphone screen. This might enable mobile, longitudinal monitoring of gaze-aversion behaviour in ASD patients in the future.
We simulated a smartphone application in which subjects were shown an image on the screen while their gaze was analysed using iTracker. We evaluated the accuracy of our set-up across three tasks in a cohort of 17 healthy volunteers. In the first two tasks, subjects were shown images of a face at different sizes and asked to alternate their gaze focus between the eyes and the mouth. In the last task, participants were asked to trace out a circle on the screen with their eyes. We confirm that iTracker can recapitulate the true gaze patterns and correctly capture the relative position of gaze, even on a phone system different from the one it was trained on. Subject-specific bias can be corrected using an error model informed by the calibration data. We compare two calibration methods and observe that a linear model performs better than a previously proposed support-vector-regression-based method.
Under controlled conditions, it is possible to reliably distinguish between gaze towards the eyes and gaze towards the mouth with a smartphone-based set-up. However, future research will be required to improve the robustness of the system to the phone's roll angle and to the distance between the user and the screen before deployment in a home setting is possible. We conclude that a smartphone-based gaze-monitoring tool offers promising opportunities for more quantitative monitoring of ASD.
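The subject-specific calibration step mentioned above can be illustrated with a minimal sketch. The paper does not publish its calibration code, so everything below is hypothetical: we assume the raw tracker output is related to the true on-screen gaze position by an unknown per-axis affine bias, which a linear error model estimates from a handful of calibration points (the simulated data, point counts, and screen dimensions are illustrative, not the study's).

```python
# Hypothetical sketch of linear gaze calibration: fit y = a*x + b per screen
# axis from calibration dots, then apply the fit to correct raw estimates.
import numpy as np

def fit_linear_calibration(raw_pts, true_pts):
    """Fit an independent linear model (slope, intercept) for each axis."""
    coeffs = []
    for axis in range(2):
        a, b = np.polyfit(raw_pts[:, axis], true_pts[:, axis], deg=1)
        coeffs.append((a, b))
    return coeffs

def apply_calibration(coeffs, raw_pts):
    """Map raw tracker estimates to corrected on-screen positions."""
    out = np.empty_like(raw_pts, dtype=float)
    for axis, (a, b) in enumerate(coeffs):
        out[:, axis] = a * raw_pts[:, axis] + b
    return out

# Simulated calibration session: true dot positions on a 6 x 10 cm screen
# area, and raw tracker estimates with a systematic scale/offset bias
# plus small Gaussian noise (all values illustrative).
rng = np.random.default_rng(0)
true_pts = rng.uniform([0.0, 0.0], [6.0, 10.0], size=(9, 2))
raw_pts = 0.8 * true_pts + 1.2 + rng.normal(0.0, 0.05, size=(9, 2))

coeffs = fit_linear_calibration(raw_pts, true_pts)
corrected = apply_calibration(coeffs, raw_pts)

raw_err = np.mean(np.linalg.norm(raw_pts - true_pts, axis=1))
cal_err = np.mean(np.linalg.norm(corrected - true_pts, axis=1))
print(f"mean error before: {raw_err:.2f} cm, after: {cal_err:.2f} cm")
```

The SVR-based alternative compared in the study would replace the per-axis `np.polyfit` with a kernelised regressor; the linear model's advantage reported above is plausible here because only a few calibration points are available per subject.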