Experimental Psychology, Justus Liebig University, Otto-Behaghel-Str. 10 F, 35394, Giessen, Germany.
Center for Mind, Brain and Behavior (CMBB), University of Marburg and Justus Liebig University Giessen, Giessen, Germany.
Behav Res Methods. 2023 Feb;55(2):570-582. doi: 10.3758/s13428-022-01831-6. Epub 2022 Mar 23.
Virtual reality (VR) is a powerful tool for researchers due to its potential to study dynamic human behavior in highly naturalistic environments while retaining full control over the presented stimuli. Due to advancements in consumer hardware, VR devices are now very affordable and have also started to include technologies such as eye tracking, further extending potential research applications. Rendering engines such as Unity, Unreal, or Vizard now enable researchers to easily create complex VR environments. However, implementing the experimental design can still pose a challenge, and these packages do not provide out-of-the-box support for trial-based behavioral experiments. Here, we present a Python toolbox designed to facilitate common tasks when developing experiments using the Vizard VR platform. It includes functionality for creating, randomizing, and presenting trial-based experimental designs and for saving results to standardized file formats. Moreover, the toolbox greatly simplifies continuous recording of eye and body movements using any hardware supported by Vizard. We further implement and describe a simple goal-directed reaching task in VR and show sample data recorded from five volunteers. The toolbox, example code, and data are all available on GitHub under an open-source license. We hope that our toolbox can simplify VR experiment development, reduce code duplication, and aid reproducibility and open-science efforts.
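To illustrate the kind of workflow the abstract describes, the sketch below shows a minimal trial-based experiment skeleton in plain Python: building a small factorial design, randomizing trial order, and writing per-trial results to a standardized CSV file. All names (the design factors, file name, and placeholder outcome) are illustrative assumptions and do not reflect the toolbox's actual API; in practice, the toolbox handles these steps within a Vizard experiment script.

```python
# Minimal sketch (standard library only) of a trial-based design with
# randomization and standardized result saving. Names are hypothetical
# and chosen only to mirror the reaching-task example in the abstract.
import csv
import itertools
import random

# Hypothetical 2x3 design: two target distances, three target directions.
distances = [0.3, 0.6]        # meters
directions = [-20, 0, 20]     # degrees of azimuth
repetitions = 5

# Build the full trial list and shuffle its order.
trials = [
    {"distance": d, "direction": a}
    for d, a in itertools.product(distances, directions)
] * repetitions
random.shuffle(trials)

results = []
for idx, trial in enumerate(trials):
    # In a real experiment, the VR scene would be updated and the
    # participant's reach recorded here; we store a placeholder outcome.
    results.append({"trial": idx, **trial, "reach_error_cm": None})

# Save one row per trial to a CSV results file.
with open("results_sub01.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=results[0].keys())
    writer.writeheader()
    writer.writerows(results)
```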