Suppr 超能文献



GazeBaseVR, a large-scale, longitudinal, binocular eye-tracking dataset collected in virtual reality.

Affiliations

Texas State University, Department of Computer Science, San Marcos, TX, 78666, USA.

Publication Information

Sci Data. 2023 Mar 30;10(1):177. doi: 10.1038/s41597-023-02075-5.

DOI: 10.1038/s41597-023-02075-5
PMID: 36997558
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10060927/
Abstract

We present GazeBaseVR, a large-scale, longitudinal, binocular eye-tracking (ET) dataset collected at 250 Hz with an ET-enabled virtual-reality (VR) headset. GazeBaseVR comprises 5,020 binocular recordings from a diverse population of 407 college-aged participants. Participants were recorded up to six times each over a 26-month period, each time performing a series of five different ET tasks: (1) a vergence task, (2) a horizontal smooth pursuit task, (3) a video-viewing task, (4) a self-paced reading task, and (5) a random oblique saccade task. Many of these participants have also been recorded for two previously published datasets with different ET devices, and 11 participants were recorded before and after COVID-19 infection and recovery. GazeBaseVR is suitable for a wide range of research on ET data in VR devices, especially eye movement biometrics due to its large population and longitudinal nature. In addition to ET data, additional participant details are provided to enable further research on topics such as fairness.

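The abstract's key acquisition parameter is the 250 Hz sampling rate of the ET-enabled VR headset. As a minimal sketch of how one might sanity-check a downloaded recording against that nominal rate, assuming each sample carries a millisecond timestamp (the field layout here is hypothetical, not the dataset's documented schema):

```python
from statistics import median

def summarize_recording(timestamps_ms):
    """Estimate sampling rate and duration from a list of ms timestamps."""
    # Inter-sample intervals; the median is robust to occasional dropped samples.
    deltas = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    rate_hz = 1000.0 / median(deltas)
    duration_s = (timestamps_ms[-1] - timestamps_ms[0]) / 1000.0
    return {"rate_hz": rate_hz, "duration_s": duration_s,
            "samples": len(timestamps_ms)}

# Synthetic 2-second recording at the paper's 250 Hz (4 ms sample period)
demo = list(range(0, 2000, 4))
print(summarize_recording(demo))
```

A recording whose estimated rate deviates noticeably from 250 Hz would warrant inspection for dropped or duplicated samples before further analysis.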

Figures (PMC):
Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d34/10063636/ee9c68d40a3b/41597_2023_2075_Fig1_HTML.jpg
Fig. 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d34/10063636/5c06ce746909/41597_2023_2075_Fig2_HTML.jpg
Fig. 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d34/10063636/cb9f8057d33c/41597_2023_2075_Fig3_HTML.jpg
Fig. 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d34/10063636/1909076cff54/41597_2023_2075_Fig4_HTML.jpg

Similar Articles

1
GazeBaseVR, a large-scale, longitudinal, binocular eye-tracking dataset collected in virtual reality.
Sci Data. 2023 Mar 30;10(1):177. doi: 10.1038/s41597-023-02075-5.
2
Exploring Gaze Dynamics in Virtual Reality through Multiscale Entropy Analysis.
Sensors (Basel). 2024 Mar 10;24(6):1781. doi: 10.3390/s24061781.
3
GazeBase, a large-scale, multi-stimulus, longitudinal eye movement dataset.
Sci Data. 2021 Jul 16;8(1):184. doi: 10.1038/s41597-021-00959-y.
4
Eye Tracking in Virtual Reality.
Curr Top Behav Neurosci. 2023;65:73-100. doi: 10.1007/7854_2022_409.
5
Assessing the attentional bias of smokers in a virtual reality anti-saccade task using eye tracking.
Biol Psychol. 2022 Jul;172:108381. doi: 10.1016/j.biopsycho.2022.108381. Epub 2022 Jun 14.
6
Validation of virtual reality system based on eye-tracking technologies to support clinical assessment of glaucoma.
Eur J Ophthalmol. 2021 Nov;31(6):3080-3086. doi: 10.1177/1120672120976047. Epub 2020 Nov 24.
7
Differential diagnosis of vergence and saccade disorders in dyslexia.
Sci Rep. 2020 Dec 17;10(1):22116. doi: 10.1038/s41598-020-79089-1.
8
Effects of Immersive Virtual Reality Headset Viewing on Young Children: Visuomotor Function, Postural Stability, and Motion Sickness.
Am J Ophthalmol. 2020 Jan;209:151-159. doi: 10.1016/j.ajo.2019.07.020. Epub 2019 Aug 1.
9
Tasks Reflected in the Eyes: Egocentric Gaze-Aware Visual Task Type Recognition in Virtual Reality.
IEEE Trans Vis Comput Graph. 2024 Nov;30(11):7277-7287. doi: 10.1109/TVCG.2024.3456164. Epub 2024 Oct 10.
10
EHTask: Recognizing User Tasks From Eye and Head Movements in Immersive Virtual Reality.
IEEE Trans Vis Comput Graph. 2023 Apr;29(4):1992-2004. doi: 10.1109/TVCG.2021.3138902. Epub 2023 Feb 28.

Cited By

1
Exploring, walking, and interacting in virtual reality with simulated low vision: a living contextual dataset.
Sci Data. 2025 Feb 25;12(1):330. doi: 10.1038/s41597-025-04560-5.
2
Unstable foveation's impact on reading, object tracking, and its implications for diagnosing and intervening in reading difficulties.
Sci Rep. 2025 Feb 24;15(1):6546. doi: 10.1038/s41598-024-83316-4.
3
Exploring Gaze Dynamics in Virtual Reality through Multiscale Entropy Analysis.
Sensors (Basel). 2024 Mar 10;24(6):1781. doi: 10.3390/s24061781.

References

1
Eye Movement Alterations in Post-COVID-19 Condition: A Proof-of-Concept Study.
Sensors (Basel). 2022 Feb 14;22(4):1481. doi: 10.3390/s22041481.
2
Small head movements increase and colour noise in data from five video-based P-CR eye trackers.
Behav Res Methods. 2022 Apr;54(2):845-863. doi: 10.3758/s13428-021-01648-9. Epub 2021 Aug 6.
3
GazeBase, a large-scale, multi-stimulus, longitudinal eye movement dataset.
Sci Data. 2021 Jul 16;8(1):184. doi: 10.1038/s41597-021-00959-y.
4
DGaze: CNN-Based Gaze Prediction in Dynamic Scenes.
IEEE Trans Vis Comput Graph. 2020 May;26(5):1902-1911. doi: 10.1109/TVCG.2020.2973473. Epub 2020 Feb 13.
5
A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000.
PeerJ. 2019 Jul 9;7:e7086. doi: 10.7717/peerj.7086. eCollection 2019.
6
Analysis of human vergence dynamics.
J Vis. 2012 Oct 25;12(11):21. doi: 10.1167/12.11.21.
7
Recording eye movements with video-oculography and scleral search coils: a direct comparison of two methods.
J Neurosci Methods. 2002 Mar 15;114(2):185-95. doi: 10.1016/s0165-0270(01)00527-1.