

Identification of Language-Induced Mental Load from Eye Behaviors in Virtual Reality.

Affiliations

Graduate School of Science and Technology, Nara Institute of Science and Technology, Ikoma 630-0192, Japan.

Department of Computer Science and Information Technology, School of Computing, Engineering and Mathematical Sciences, La Trobe University, Melbourne Campus, Melbourne, VIC 3086, Australia.

Publication

Sensors (Basel). 2023 Jul 25;23(15):6667. doi: 10.3390/s23156667.

DOI: 10.3390/s23156667
PMID: 37571449
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10422404/
Abstract

Experiences of virtual reality (VR) can easily break if the method of evaluating subjective user states is intrusive. Behavioral measures are increasingly used to avoid this problem. One such measure is eye tracking, which recently became more standard in VR and is often used for content-dependent analyses. This research is an endeavor to utilize content-independent eye metrics, such as pupil size and blinks, for identifying mental load in VR users. We generated mental load independently from visuals through auditory stimuli. We also defined and measured a new eye metric, focus offset, which seeks to measure the phenomenon of "staring into the distance" without focusing on a specific surface. In the experiment, VR-experienced participants listened to two native and two foreign language stimuli inside a virtual phone booth. The results show that with increasing mental load, relative pupil size on average increased 0.512 SDs (0.118 mm), with 57% reduced variance. To a lesser extent, mental load led to fewer fixations, less voluntary gazing at distracting content, and a larger focus offset as if looking through surfaces (about 0.343 SDs, 5.10 cm). These results are in agreement with previous studies. Overall, we encourage further research on content-independent eye metrics, and we hope that hardware and algorithms will be developed in the future to further increase tracking stability.
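The abstract reports pupil size in participant-relative standard deviations (an increase of 0.512 SDs, about 0.118 mm). The paper's exact analysis pipeline is not reproduced on this page; the sketch below is only a minimal, hypothetical illustration of how such a content-independent metric can be obtained by z-scoring each participant's pupil-diameter samples against their own mean and SD, so values are comparable across individuals:

```python
import statistics


def relative_pupil_size(samples_mm):
    """Convert raw pupil-diameter samples (mm) for one participant
    into z-scores relative to that participant's own mean and SD.

    This is an illustrative sketch, not the paper's actual code:
    standardizing per participant removes individual baseline
    differences, which is what makes the metric content- and
    person-independent."""
    mean = statistics.mean(samples_mm)
    sd = statistics.stdev(samples_mm)  # sample standard deviation
    return [(x - mean) / sd for x in samples_mm]


# Hypothetical samples: a slight dilation at the end of the trial.
z = relative_pupil_size([3.0, 3.1, 3.2, 3.1, 3.0, 3.3])
```

By construction the z-scored series has mean 0 and SD 1, so an effect such as "+0.512 SDs" can be read directly off the standardized values.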


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/c24744fd9910/sensors-23-06667-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/dec54331526c/sensors-23-06667-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/b9f92e0478ab/sensors-23-06667-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/f94fc06c2a74/sensors-23-06667-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/e8265710c02a/sensors-23-06667-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/52bdf6cb8d2a/sensors-23-06667-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/83313da6d5be/sensors-23-06667-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/44480b2c2423/sensors-23-06667-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/b54c275eb8f5/sensors-23-06667-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/83305cd151ef/sensors-23-06667-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/211d/10422404/19c05c8118c1/sensors-23-06667-g011.jpg

Similar Articles

1. Identification of Language-Induced Mental Load from Eye Behaviors in Virtual Reality.
Sensors (Basel). 2023 Jul 25;23(15):6667. doi: 10.3390/s23156667.
2. Eye Tracking in Virtual Reality.
Curr Top Behav Neurosci. 2023;65:73-100. doi: 10.1007/7854_2022_409.
3. Comparison of visual fatigue caused by head-mounted display for virtual reality and two-dimensional display using objective and subjective evaluation.
Ergonomics. 2019 Jun;62(6):759-766. doi: 10.1080/00140139.2019.1582805. Epub 2019 Mar 14.
4. Immersive Virtual Reality and Ocular Tracking for Brain Mapping During Awake Surgery: Prospective Evaluation Study.
J Med Internet Res. 2021 Mar 24;23(3):e24373. doi: 10.2196/24373.
5. An eye tracking based virtual reality system for use inside magnetic resonance imaging systems.
Sci Rep. 2021 Aug 11;11(1):16301. doi: 10.1038/s41598-021-95634-y.
6. Eye movement characteristics in a mental rotation task presented in virtual reality.
Front Neurosci. 2023 Mar 27;17:1143006. doi: 10.3389/fnins.2023.1143006. eCollection 2023.
7. Facial Motion Capture System Based on Facial Electromyogram and Electrooculogram for Immersive Social Virtual Reality Applications.
Sensors (Basel). 2023 Mar 29;23(7):3580. doi: 10.3390/s23073580.
8. Using virtual reality-based neurocognitive testing and eye tracking to study naturalistic cognitive-motor performance.
Neuropsychologia. 2024 Feb 15;194:108744. doi: 10.1016/j.neuropsychologia.2023.108744. Epub 2023 Dec 8.
9. A review on ergonomics evaluations of virtual reality.
Work. 2023;74(3):831-841. doi: 10.3233/WOR-205232.
10. Size and shape constancy in consumer virtual reality.
Behav Res Methods. 2020 Aug;52(4):1587-1598. doi: 10.3758/s13428-019-01336-9.

Cited By

1. A Review of the Use of Gaze and Pupil Metrics to Assess Mental Workload in Gamified and Simulated Sensorimotor Tasks.
Sensors (Basel). 2024 Mar 8;24(6):1759. doi: 10.3390/s24061759.

References

1. A Scoping Review of Flow Research.
Front Psychol. 2022 Apr 7;13:815665. doi: 10.3389/fpsyg.2022.815665. eCollection 2022.
2. A Case for Studying Naturalistic Eye and Head Movements in Virtual Environments.
Front Psychol. 2021 Dec 31;12:650693. doi: 10.3389/fpsyg.2021.650693. eCollection 2021.
3. How Reliably Do Eye Parameters Indicate Internal Versus External Attentional Focus?
Cogn Sci. 2021 Apr;45(4):e12977. doi: 10.1111/cogs.12977.
4. Eye Tracking in Virtual Reality.
J Eye Mov Res. 2019 Apr 5;12(1). doi: 10.16910/jemr.12.1.3.
5. Language Familiarity and Proficiency Leads to Differential Cortical Processing During Translation Between Distantly Related Languages.
Front Hum Neurosci. 2021 Feb 26;15:593108. doi: 10.3389/fnhum.2021.593108. eCollection 2021.
6. Audio in VR: Effects of a Soundscape and Movement-Triggered Step Sounds on Presence.
Front Robot AI. 2020 Feb 21;7:20. doi: 10.3389/frobt.2020.00020. eCollection 2020.
7. Eye behavior predicts susceptibility to visual distraction during internally directed cognition.
Atten Percept Psychophys. 2020 Oct;82(7):3432-3444. doi: 10.3758/s13414-020-02068-1.
8. GazeR: A Package for Processing Gaze Position and Pupil Size Data.
Behav Res Methods. 2020 Oct;52(5):2232-2255. doi: 10.3758/s13428-020-01374-8.
9. Using biomechanics to investigate the effect of VR on eye vergence system.
Appl Ergon. 2019 Nov;81:102883. doi: 10.1016/j.apergo.2019.102883. Epub 2019 Jul 3.
10. Saliency in VR: How Do People Explore Virtual Environments?
IEEE Trans Vis Comput Graph. 2018 Apr;24(4):1633-1642. doi: 10.1109/TVCG.2018.2793599.