
Uncertainty visualization of gaze estimation to support operator-controlled calibration.

Authors

Hassoumi Almoctar, Peysakhovich Vsevolod, Hurter Christophe

Affiliations

École Nationale de l'Aviation Civile, France.

ISAE-SUPAERO, France.

Publication

J Eye Mov Res. 2018 Jan 25;10(5). doi: 10.16910/jemr.10.5.6.

Abstract

In this paper, we investigate how visualization assets can support the qualitative evaluation of gaze estimation uncertainty. Although eye tracking data are commonly available, little has been done to visually investigate the uncertainty of recorded gaze information. This paper aims to fill this gap with innovative uncertainty computation and visualization. Given a gaze processing pipeline, we estimate the location of the gaze position in the world-camera image. To do so, we developed our own gaze data processing, which gives us access to every stage of the data transformation and thus to the uncertainty computation. To validate our gaze estimation pipeline, we designed an experiment with 12 participants and showed that the correction methods we proposed reduced the error by about 1.32 cm, aggregating all 12 participants' results. The Mean Angular Error is 0.25° (SD=0.15°) after correction of the estimated gaze. Next, to support the qualitative assessment of this data, we provide a map that encodes the actual uncertainty from the user's point of view.
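The Mean Angular Error reported above is the standard accuracy metric for gaze estimation: the angle between the estimated and the true gaze direction, averaged over samples. The abstract does not give the authors' implementation, so the following is only a minimal generic sketch of how such a metric is typically computed from 3D direction vectors; all names and sample values are illustrative, not taken from the paper.

```python
import math

def angular_error_deg(est, true):
    """Angle (in degrees) between two 3D gaze direction vectors."""
    dot = sum(a * b for a, b in zip(est, true))
    norm = math.sqrt(sum(a * a for a in est)) * math.sqrt(sum(b * b for b in true))
    # Clamp to avoid domain errors from floating-point round-off.
    cos = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cos))

# Mean Angular Error over a batch of (estimated, ground-truth) pairs.
samples = [((0.00, 0.0, 1.0), (0.0, 0.0, 1.0)),
           ((0.01, 0.0, 1.0), (0.0, 0.0, 1.0))]
mae = sum(angular_error_deg(e, t) for e, t in samples) / len(samples)
```

Per-sample errors can also be binned over the field of view to build an uncertainty map of the kind the paper describes.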


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/8331/7141080/e4c8c11d6175/jemr-10-05-f-figure-01.jpg
