Suppr 超能文献



Integrating Gaze Tracking and Head-Motion Prediction for Mobile Device Authentication: A Proof of Concept.

Affiliations

School of Cyber Engineering, Xidian University, Xi'an 710071, China.

Shaanxi Key Laboratory of Network and System Security, Xidian University, Xi'an 710071, China.

Publication

Sensors (Basel). 2018 Aug 31;18(9):2894. doi: 10.3390/s18092894.

PMID: 30200380
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6164076/
Abstract

We introduce a two-stream model that uses reflexive eye movements to authenticate users on smart mobile devices. The model builds on two pre-trained neural networks targeting two independent tasks: (i) gaze tracking and (ii) future-frame prediction. A procedure randomly generates a visual stimulus on the screen of the mobile device, while the front camera simultaneously captures the user's head motion as they watch it. The gaze-tracking stream computes the gaze-coordinate error, which serves as one authentication feature. To compensate for imprecise gaze coordinates caused by the low resolution of the front camera, the future-frame-prediction stream additionally extracts motion features between consecutive frames. To resist traditional attacks (shoulder surfing and impersonation) during mobile device authentication, we combine the two kinds of features to train a two-class support vector machine (SVM) classifier. Experiments show that the classifier achieves 98.6% accuracy in authenticating the user identity of mobile devices.
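The final step of the pipeline, fusing the two feature streams into a two-class SVM, can be sketched as follows. This is a minimal illustration only: the feature dimensions, value distributions, and helper `make_session` are invented for demonstration and do not come from the paper, which uses real gaze-error and inter-frame motion features.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_session(is_owner: bool, n: int = 100) -> np.ndarray:
    """Simulate per-frame features for one viewing session (hypothetical)."""
    # Gaze-error stream: a legitimate user tracks the stimulus closely,
    # so the gaze-coordinate error is small; an attacker's is larger.
    gaze_err = rng.normal(0.1 if is_owner else 0.6, 0.05, size=(n, 4))
    # Motion stream: inter-frame head-motion features differ per user.
    motion = rng.normal(0.0 if is_owner else 0.4, 0.1, size=(n, 8))
    # Concatenate the two streams into one feature vector per frame.
    return np.hstack([gaze_err, motion])

# Build a labeled training set: 1 = legitimate user, 0 = attacker.
X = np.vstack([make_session(True), make_session(False)])
y = np.array([1] * 100 + [0] * 100)

# Two-class SVM over the fused features, as in the paper's final stage.
clf = SVC(kernel="rbf").fit(X, y)
acc = clf.score(X, y)
```

With well-separated synthetic distributions the classifier separates the two classes almost perfectly; the paper's reported 98.6% accuracy is on its real dataset, not reproduced by this toy sketch.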


(Figures 1–14 and Appendix Figures A1–A2 are available in the PMC full text, PMC6164076.)

Similar Articles

1. Integrating Gaze Tracking and Head-Motion Prediction for Mobile Device Authentication: A Proof of Concept.
Sensors (Basel). 2018 Aug 31;18(9):2894. doi: 10.3390/s18092894.
2. Gaze Tracking and Point Estimation Using Low-Cost Head-Mounted Devices.
Sensors (Basel). 2020 Mar 30;20(7):1917. doi: 10.3390/s20071917.
3. Salient features in gaze-aligned recordings of human visual input during free exploration of natural environments.
J Vis. 2008 Oct 23;8(14):12.1-17. doi: 10.1167/8.14.12.
4. Novel eye gaze tracking techniques under natural head movement.
IEEE Trans Biomed Eng. 2007 Dec;54(12):2246-60. doi: 10.1109/tbme.2007.895750.
5. Adaptive eye-gaze tracking using neural-network-based user profiles to assist people with motor disability.
J Rehabil Res Dev. 2008;45(6):801-17. doi: 10.1682/jrrd.2007.05.0075.
6. An Effective Gaze-Based Authentication Method with the Spatiotemporal Feature of Eye Movement.
Sensors (Basel). 2022 Apr 14;22(8):3002. doi: 10.3390/s22083002.
7. Enhancement of the vestibulo-ocular reflex by prior eye movements.
J Neurophysiol. 1999 Jun;81(6):2884-92. doi: 10.1152/jn.1999.81.6.2884.
8. Tracking gaze while walking on a treadmill: spatial accuracy and limits of use of a stationary remote eye-tracker.
Annu Int Conf IEEE Eng Med Biol Soc. 2014;2014:3727-30. doi: 10.1109/EMBC.2014.6944433.
9. Mobile gaze tracking system for outdoor walking behavioral studies.
J Vis. 2016;16(3):27. doi: 10.1167/16.3.27.
10. Gaze estimation interpolation methods based on binocular data.
IEEE Trans Biomed Eng. 2012 Aug;59(8):2235-2243. doi: 10.1109/TBME.2012.2201716. Epub 2012 May 30.

Cited By

1. Generative model-enhanced human motion prediction.
Appl AI Lett. 2022 Apr;3(2):e63. doi: 10.1002/ail2.63. Epub 2022 Mar 23.

References

1. BULDP: Biomimetic Uncorrelated Locality Discriminant Projection for Feature Extraction in Face Recognition.
IEEE Trans Image Process. 2018 Feb 15. doi: 10.1109/TIP.2018.2806229.
2. Matching Contactless and Contact-based Conventional Fingerprint Images for Biometrics Identification.
IEEE Trans Image Process. 2018 Apr;27(4):2008-2021. doi: 10.1109/TIP.2017.2788866. Epub 2018 Jan 1.
3. State-of-the-art in visual attention modeling.
IEEE Trans Pattern Anal Mach Intell. 2013 Jan;35(1):185-207. doi: 10.1109/TPAMI.2012.89.
4. In the eye of the beholder: a survey of models for eyes and gaze.
IEEE Trans Pattern Anal Mach Intell. 2010 Mar;32(3):478-500. doi: 10.1109/TPAMI.2009.30.
5. Speed and accuracy of saccadic eye movements: characteristics of impulse variability in the oculomotor system.
J Exp Psychol Hum Percept Perform. 1989 Aug;15(3):529-43. doi: 10.1037//0096-1523.15.3.529.