Suppr 超能文献




A tutorial: Analyzing eye and head movements in virtual reality.

Affiliations

Department of Psychology, University of British Columbia, 2136 West Mall, Vancouver, BC, V6T 1Z4, Canada.

Publication information

Behav Res Methods. 2024 Dec;56(8):8396-8421. doi: 10.3758/s13428-024-02482-5. Epub 2024 Aug 8.

DOI: 10.3758/s13428-024-02482-5
PMID: 39117987
Abstract

This tutorial provides instruction on how to use the eye tracking technology built into virtual reality (VR) headsets, emphasizing the analysis of head and eye movement data when an observer is situated in the center of an omnidirectional environment. We begin with a brief description of how VR eye movement research differs from previous forms of eye movement research, as well as identifying some outstanding gaps in the current literature. We then introduce the basic methodology used to collect VR eye movement data both in general and with regard to the specific data that we collected to illustrate different analytical approaches. We continue with an introduction of the foundational ideas regarding data analysis in VR, including frames of reference, how to map eye and head position, and event detection. In the next part, we introduce core head and eye data analyses focusing on determining where the head and eyes are directed. We then expand on what has been presented, introducing several novel spatial, spatio-temporal, and temporal head-eye data analysis techniques. We conclude with a reflection on what has been presented, and how the techniques introduced in this tutorial provide the scaffolding for extensions to more complex and dynamic VR environments.
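The pipeline the abstract outlines — choosing a frame of reference, mapping the eye direction from the head frame into the world frame, and running event detection on the resulting gaze signal — can be sketched in a few lines. This is a minimal illustration under stated assumptions, not code from the tutorial: the (w, x, y, z) quaternion convention, the function names, and the 100°/s I-VT saccade threshold are all illustrative choices.

```python
import math

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    # v' = v + 2w(u x v) + 2u x (u x v), with u = (x, y, z)
    tx = 2 * (y * v[2] - z * v[1])
    ty = 2 * (z * v[0] - x * v[2])
    tz = 2 * (x * v[1] - y * v[0])
    return (v[0] + w * tx + y * tz - z * ty,
            v[1] + w * ty + z * tx - x * tz,
            v[2] + w * tz + x * ty - y * tx)

def gaze_in_world(head_orientation_q, eye_dir_head):
    """Map an eye direction from the head frame into the world frame."""
    return quat_rotate(head_orientation_q, eye_dir_head)

def angular_speed_deg(v1, v2, dt):
    """Angular distance between two gaze directions, in deg/s over dt seconds."""
    dot = sum(a * b for a, b in zip(v1, v2))
    norm = math.sqrt(sum(a * a for a in v1)) * math.sqrt(sum(a * a for a in v2))
    c = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(c)) / dt

def classify_ivt(gaze_dirs, dt, threshold_deg_s=100.0):
    """Velocity-threshold (I-VT) labeling of each inter-sample interval."""
    return ['saccade' if angular_speed_deg(a, b, dt) > threshold_deg_s
            else 'fixation'
            for a, b in zip(gaze_dirs, gaze_dirs[1:])]
```

With head orientation logged as a unit quaternion per sample and eye direction as a unit vector in the head frame, `gaze_in_world` gives the world-frame gaze direction, and `classify_ivt` gives a first-pass fixation/saccade labeling; dedicated classifiers such as REMoDNaV (in the reference list below) are considerably more robust to noise.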


Similar articles

1
A tutorial: Analyzing eye and head movements in virtual reality.
Behav Res Methods. 2024 Dec;56(8):8396-8421. doi: 10.3758/s13428-024-02482-5. Epub 2024 Aug 8.
2
Eye Tracking in Virtual Reality.
Curr Top Behav Neurosci. 2023;65:73-100. doi: 10.1007/7854_2022_409.
3
Tasks Reflected in the Eyes: Egocentric Gaze-Aware Visual Task Type Recognition in Virtual Reality.
IEEE Trans Vis Comput Graph. 2024 Nov;30(11):7277-7287. doi: 10.1109/TVCG.2024.3456164. Epub 2024 Oct 10.
4
EHTask: Recognizing User Tasks From Eye and Head Movements in Immersive Virtual Reality.
IEEE Trans Vis Comput Graph. 2023 Apr;29(4):1992-2004. doi: 10.1109/TVCG.2021.3138902. Epub 2023 Feb 28.
5
The Impact of Virtual Reality Content Characteristics on Cybersickness and Head Movement Patterns.
Sensors (Basel). 2025 Jan 2;25(1):215. doi: 10.3390/s25010215.
6
Eye and head movements while encoding and recognizing panoramic scenes in virtual reality.
PLoS One. 2023 Feb 17;18(2):e0282030. doi: 10.1371/journal.pone.0282030. eCollection 2023.
7
Exploring Gaze Dynamics in Virtual Reality through Multiscale Entropy Analysis.
Sensors (Basel). 2024 Mar 10;24(6):1781. doi: 10.3390/s24061781.
8
SGaze: A Data-Driven Eye-Head Coordination Model for Realtime Gaze Prediction.
IEEE Trans Vis Comput Graph. 2019 May;25(5):2002-2010. doi: 10.1109/TVCG.2019.2899187. Epub 2019 Feb 18.
9
Study protocol for the EYEdentify project: An examination of gaze behaviour in autistic adults using a virtual reality-based paradigm.
PLoS One. 2025 Apr 9;20(4):e0316502. doi: 10.1371/journal.pone.0316502. eCollection 2025.
10
Adaptive responses in eye-head-hand coordination following exposures to a virtual environment as a possible space flight analog.
J Gravit Physiol. 2007 Jul;14(1):P83-4.

Cited by

1
Decoding target discriminability and time pressure using eye and head movement features in a foraging search task.
Cogn Res Princ Implic. 2025 Aug 22;10(1):53. doi: 10.1186/s41235-025-00657-y.
2
Mass Casualty Incident Training in Immersive Virtual Reality: Quasi-Experimental Evaluation of Multimethod Performance Indicators.
J Med Internet Res. 2025 Jan 27;27:e63241. doi: 10.2196/63241.

References

1
Eye and head movements while encoding and recognizing panoramic scenes in virtual reality.
PLoS One. 2023 Feb 17;18(2):e0282030. doi: 10.1371/journal.pone.0282030. eCollection 2023.
2
Spatiotemporal image quality of virtual reality head mounted displays.
Sci Rep. 2022 Nov 24;12(1):20235. doi: 10.1038/s41598-022-24345-9.
3
The importance of peripheral vision when searching 3D real-world scenes: A gaze-contingent study in virtual reality.
J Vis. 2021 Jul 6;21(7):3. doi: 10.1167/jov.21.7.3.
4
REMoDNaV: robust eye-movement classification for dynamic stimulation.
Behav Res Methods. 2021 Feb;53(1):399-414. doi: 10.3758/s13428-020-01428-x.
5
Task-dependence in scene perception: Head unrestrained viewing using mobile eye-tracking.
J Vis. 2020 May 11;20(5):3. doi: 10.1167/jov.20.5.3.
6
Differences in eye movement range based on age and gaze direction.
Eye (Lond). 2019 Jul;33(7):1145-1151. doi: 10.1038/s41433-019-0376-4. Epub 2019 Mar 5.
7
Is the eye-movement field confused about fixations and saccades? A survey among 124 researchers.
R Soc Open Sci. 2018 Aug 29;5(8):180502. doi: 10.1098/rsos.180502. eCollection 2018 Aug.
8
Saliency in VR: How Do People Explore Virtual Environments?
IEEE Trans Vis Comput Graph. 2018 Apr;24(4):1633-1642. doi: 10.1109/TVCG.2018.2793599.
9
Is human classification by experienced untrained observers a gold standard in fixation detection?
Behav Res Methods. 2018 Oct;50(5):1864-1881. doi: 10.3758/s13428-017-0955-x.
10
Are fixations in static natural scenes a useful predictor of attention in the real world?
Can J Exp Psychol. 2017 Jun;71(2):172-181. doi: 10.1037/cep0000125.