

An integrated neural network model for eye-tracking during human-computer interaction.

Authors

Wang Li, Wang Changyuan, Zhang Yu, Gao Lina

Affiliations

School of Optoelectronic Engineering, Xi'an Technological University, Xi'an 710000, China.

School of Computer Science, Xi'an Technological University, Xi'an 710000, China.

Publication

Math Biosci Eng. 2023 Jun 21;20(8):13974-13988. doi: 10.3934/mbe.2023622.

DOI: 10.3934/mbe.2023622
PMID: 37679119
Abstract

Improving the efficiency of human-computer interaction is one of the critical goals of intelligent aircraft cockpit research. The gaze interaction control method can vastly reduce the manual operation of operators and improve the intellectual level of human-computer interaction. Eye-tracking is the basis of sight interaction, so the performance of eye-tracking will directly affect the outcome of gaze interaction. This paper presents an eye-tracking method suitable for human-computer interaction in an aircraft cockpit, which can now estimate the gaze position of operators on multiple screens based on face images. We use a multi-camera system to capture facial images, so that operators are not limited by the angle of head rotation. To improve the accuracy of gaze estimation, we have constructed a hybrid network. One branch uses the transformer framework to extract the global features of the face images; the other branch uses a convolutional neural network structure to extract the local features of the face images. Finally, the extracted features of the two branches are fused for eye-tracking. The experimental results show that the proposed method not only solves the problem of limited head movement for operators but also improves the accuracy of gaze estimation. In addition, our method has a capture rate of more than 80% for targets of different sizes, which is better than the other compared models.
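The hybrid network described above fuses a transformer-style branch (global facial features via self-attention over patches) with a CNN-style branch (local features via convolution) before regressing a gaze point. The sketch below is a minimal NumPy illustration of that two-branch fusion idea, not the authors' implementation; all dimensions, weights, and function names are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_branch(patches, Wq, Wk, Wv):
    # Transformer-style branch: single-head self-attention over
    # flattened face-image patches, mean-pooled to one global feature.
    Q, K, V = patches @ Wq, patches @ Wk, patches @ Wv
    attn = softmax(Q @ K.T / np.sqrt(K.shape[1]))
    return (attn @ V).mean(axis=0)

def local_branch(image, kernel):
    # CNN-style branch: one valid 2D convolution + ReLU,
    # global-average-pooled to a local feature.
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.array([[(image[i:i + kh, j:j + kw] * kernel).sum()
                     for j in range(W - kw + 1)]
                    for i in range(H - kh + 1)])
    return np.array([np.maximum(out, 0).mean()])

def fuse_and_regress(g_feat, l_feat, W_head):
    # Concatenate both branches' features and regress a 2D gaze point.
    fused = np.concatenate([g_feat, l_feat])
    return fused @ W_head

rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))          # toy "face image"
# split into four 4x4 patches, flattened to embeddings of dim 16
patches = image.reshape(2, 4, 2, 4).transpose(0, 2, 1, 3).reshape(4, 16)
d = 16
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
kernel = rng.standard_normal((3, 3))
W_head = rng.standard_normal((d + 1, 2)) * 0.1

gaze = fuse_and_regress(global_branch(patches, Wq, Wk, Wv),
                        local_branch(image, kernel), W_head)
print(gaze.shape)  # a 2D (x, y) gaze estimate
```

In the paper's setting the fused feature would map to a screen coordinate across multiple displays; here the random weights only demonstrate the data flow through the two branches and the fusion head.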


Similar Articles

1. An integrated neural network model for eye-tracking during human-computer interaction. Math Biosci Eng. 2023 Jun 21;20(8):13974-13988. doi: 10.3934/mbe.2023622.
2. When I Look into Your Eyes: A Survey on Computer Vision Contributions for Human Gaze Estimation and Tracking. Sensors (Basel). 2020 Jul 3;20(13):3739. doi: 10.3390/s20133739.
3. Person-Specific Gaze Estimation from Low-Quality Webcam Images. Sensors (Basel). 2023 Apr 20;23(8):4138. doi: 10.3390/s23084138.
4. Gaze Tracking Based on Concatenating Spatial-Temporal Features. Sensors (Basel). 2022 Jan 11;22(2):545. doi: 10.3390/s22020545.
5. Hybrid Eye-Tracking on a Smartphone with CNN Feature Extraction and an Infrared 3D Model. Sensors (Basel). 2020 Jan 19;20(2):543. doi: 10.3390/s20020543.
6. FreeGaze: A Framework for 3D Gaze Estimation Using Appearance Cues from a Facial Video. Sensors (Basel). 2023 Dec 4;23(23):9604. doi: 10.3390/s23239604.
7. Gaze Tracking and Point Estimation Using Low-Cost Head-Mounted Devices. Sensors (Basel). 2020 Mar 30;20(7):1917. doi: 10.3390/s20071917.
8. Uncertainty-Aware Gaze Tracking for Assisted Living Environments. IEEE Trans Image Process. 2023;32:2335-2347. doi: 10.1109/TIP.2023.3253253. Epub 2023 Apr 21.
9. Convolutional Neural Network-Based Technique for Gaze Estimation on Mobile Devices. Front Artif Intell. 2022 Jan 26;4:796825. doi: 10.3389/frai.2021.796825. eCollection 2021.
10. High-Accuracy 3D Gaze Estimation with Efficient Recalibration for Head-Mounted Gaze Tracking Systems. Sensors (Basel). 2022 Jun 8;22(12):4357. doi: 10.3390/s22124357.