Graduate Program in Chemistry, University of São Paulo, Institute of Chemistry.
Graduate Program in Applied Mathematics, University of São Paulo, Institute of Mathematics and Statistics.
J Vis Exp. 2024 Oct 18(212). doi: 10.3791/65977.
We present a method for real-time recording of human interaction with three-dimensional (3D) virtual objects. The approach consists of associating rotation data of the manipulated object with behavioral measures, such as eye tracking, to make better inferences about the underlying cognitive processes. The task consists of displaying two identical models of the same 3D object (a molecule) on a computer screen: an interactive, rotatable object (iObj) and a static, target object (tObj). Participants must rotate iObj using the mouse until they consider its orientation identical to that of tObj. The computer tracks all interaction data in real time. The participant's gaze data are also recorded using an eye tracker. The measurement frequency is 10 Hz on the computer and 60 Hz on the eye tracker. The orientation data of iObj with respect to tObj are recorded as rotation quaternions. The gaze data are synchronized to the orientation of iObj and referenced in this same system. This method enables us to obtain the following visualizations of the process by which humans interact with iObj and tObj: (1) angular disparity synchronized with other time-dependent data; (2) the 3D rotation trajectory inside what we decided to call a "ball of rotations"; (3) a 3D fixation heatmap. All steps of the protocol use free software, such as GNU Octave and Jmol, and all scripts are available as supplementary material. With this approach, we can conduct detailed quantitative studies of the task-solving process involving mental or physical rotations, rather than only the outcome reached. It is possible to measure precisely how important each part of the 3D models is for the participant in solving tasks, and thus relate these measurements to relevant variables such as the characteristics of the objects, the cognitive abilities of the individuals, and the characteristics of the human-machine interface.
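The angular disparity mentioned above can be derived directly from the recorded quaternions: for two unit quaternions q1 and q2 describing the orientations of iObj and tObj, the smallest rotation angle taking one into the other is 2·arccos(|⟨q1, q2⟩|). The paper's supplementary scripts are written in GNU Octave; the following is only an illustrative sketch in Python (function name and quaternion ordering w, x, y, z are assumptions, not taken from the protocol):

```python
# Sketch: angular disparity between two orientations given as unit
# quaternions (w, x, y, z). Not the paper's Octave script.
import math

def angular_disparity(q1, q2):
    """Smallest rotation angle (radians) between orientations q1 and q2."""
    dot = abs(sum(a * b for a, b in zip(q1, q2)))  # |<q1, q2>|
    dot = min(1.0, dot)  # guard against floating-point overshoot
    return 2.0 * math.acos(dot)

# Identical orientations give 0; a 90-degree rotation about z gives pi/2.
q_identity = (1.0, 0.0, 0.0, 0.0)
q_z90 = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
print(angular_disparity(q_identity, q_identity))  # 0.0
print(angular_disparity(q_identity, q_z90))       # ~1.5708 (pi/2)
```

The absolute value of the dot product handles the double cover of rotations by quaternions (q and −q describe the same orientation), so the result is always the smaller of the two equivalent angles.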