
Continuous Driver's Gaze Zone Estimation Using RGB-D Camera

Affiliations

Information Science and Technology College, Dalian Maritime University, Dalian 116026, China.

School of Microelectronics, Dalian University of Technology, Dalian 116024, China.

Publication Information

Sensors (Basel). 2019 Mar 14;19(6):1287. doi: 10.3390/s19061287.

DOI: 10.3390/s19061287
PMID: 30875740
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6471141/
Abstract

The driver gaze zone is an indicator of a driver's attention and plays an important role in the driver's activity monitoring. Due to the bad initialization of point-cloud transformation, gaze zone systems using RGB-D cameras and ICP (Iterative Closet Points) algorithm do not work well under long-time head motion. In this work, a solution for a continuous driver gaze zone estimation system in real-world driving situations is proposed, combining multi-zone ICP-based head pose tracking and appearance-based gaze estimation. To initiate and update the coarse transformation of ICP, a particle filter with auxiliary sampling is employed for head state tracking, which accelerates the iterative convergence of ICP. Multiple templates for different gaze zone are applied to balance the templates revision of ICP under large head movement. For the RGB information, an appearance-based gaze estimation method with two-stage neighbor selection is utilized, which treats the gaze prediction as the combination of neighbor query (in head pose and eye image feature space) and linear regression (between eye image feature space and gaze angle space). The experimental results show that the proposed method outperforms the baseline methods on gaze estimation, and can provide a stable head pose tracking for driver behavior analysis in real-world driving scenarios.
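The abstract describes seeding ICP's coarse point-cloud transform from a head state tracked by a particle filter with auxiliary sampling, so that ICP converges in fewer iterations under continuous head motion. The following is an illustrative sketch of that idea, not the authors' implementation: a minimal particle filter over head angles with a constant-velocity motion model, where a plain bootstrap resampler stands in for the paper's auxiliary sampling. All parameters, units (degrees, per-frame), and the synthetic observation model are assumptions.

```python
# Minimal particle filter over head angles; its mean pose supplies the
# coarse rotation used to initialize ICP each frame. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

N = 200
# State per particle: [yaw, pitch, roll, d_yaw, d_pitch, d_roll]
# in degrees and degrees per frame.
state = rng.normal(0.0, 1.0, size=(N, 6))
weights = np.full(N, 1.0 / N)

def predict(state, q_pos=0.2, q_vel=0.1):
    """Constant-velocity motion model with process noise."""
    out = state.copy()
    out[:, :3] += out[:, 3:]
    out[:, :3] += rng.normal(0.0, q_pos, size=(len(out), 3))
    out[:, 3:] += rng.normal(0.0, q_vel, size=(len(out), 3))
    return out

def update(state, weights, observed_angles, r=2.0):
    """Reweight by a Gaussian likelihood of a rough per-frame head-angle
    observation (e.g. from a fast face detector), then resample."""
    err = state[:, :3] - observed_angles
    logw = np.log(weights) - 0.5 * np.sum(err**2, axis=1) / r**2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(len(state), size=len(state), p=w)
    return state[idx], np.full(len(state), 1.0 / len(state))

# Simulate a head turning 1 degree per frame in yaw for 60 frames.
truth = np.zeros(3)
for _ in range(60):
    truth = truth + np.array([1.0, 0.0, 0.0])
    observation = truth + rng.normal(0.0, 2.0, size=3)
    state = predict(state)
    state, weights = update(state, weights, observation)

# The particle mean gives the coarse rotation handed to ICP.
coarse_rotation = state[:, :3].mean(axis=0)
```

Because the filter carries velocity in its state, its predicted pose stays close to the true pose during sustained head turns, which is exactly the situation where a stale ICP initialization would otherwise drift.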

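The abstract's appearance-based stage treats gaze prediction as a neighbor query in head-pose and eye-image feature space followed by a linear regression from eye features to gaze angles. A hypothetical sketch of that two-stage scheme is below; the data, feature dimensions, and neighbor counts are all illustrative stand-ins, not the paper's actual features or settings.

```python
# Two-stage neighbor selection + local linear regression, as a sketch:
# stage 1 filters training samples by head-pose distance, stage 2 by
# eye-feature distance, then a local linear map predicts gaze angles.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: head poses (yaw/pitch/roll), eye-image feature
# vectors, and gaze angles generated by a known linear map for the demo.
n, d = 500, 16
head_poses = rng.uniform(-30.0, 30.0, size=(n, 3))
eye_feats = rng.normal(size=(n, d))
true_map = rng.normal(size=(d, 2))
gaze_angles = eye_feats @ true_map  # (yaw, pitch) of gaze

def estimate_gaze(query_pose, query_feat, k_pose=100, k_feat=20):
    """Two-stage neighbor query followed by local linear regression."""
    # Stage 1: the k_pose samples whose head pose is closest to the query.
    idx = np.argsort(np.linalg.norm(head_poses - query_pose, axis=1))[:k_pose]
    # Stage 2: among those, the k_feat closest in eye-feature space.
    order = np.argsort(np.linalg.norm(eye_feats[idx] - query_feat, axis=1))
    idx = idx[order[:k_feat]]
    # Local linear regression from eye features (+ bias) to gaze angles.
    X = np.hstack([eye_feats[idx], np.ones((len(idx), 1))])
    W, *_ = np.linalg.lstsq(X, gaze_angles[idx], rcond=None)
    return np.append(query_feat, 1.0) @ W

pred = estimate_gaze(head_poses[0], eye_feats[0])
```

Restricting the regression to pose-consistent neighbors is what lets a simple linear model work: the feature-to-gaze mapping is only approximately linear within a narrow range of head poses.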

Figures (PMC, g001–g013)

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/6f0b29e59a3b/sensors-19-01287-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/462c871f18bd/sensors-19-01287-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/7b0361b8f114/sensors-19-01287-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/fbba5241fdff/sensors-19-01287-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/d3cea25ba6f1/sensors-19-01287-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/44d9c4631955/sensors-19-01287-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/033db1dcc0e3/sensors-19-01287-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/519dc35d74b3/sensors-19-01287-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/732fdab4ed68/sensors-19-01287-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/9814694a4a88/sensors-19-01287-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/51026477b5a2/sensors-19-01287-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/1f57b8dae7dd/sensors-19-01287-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0f33/6471141/66f1edbcc87e/sensors-19-01287-g013.jpg

Similar Articles

1. Continuous Driver's Gaze Zone Estimation Using RGB-D Camera.
Sensors (Basel). 2019 Mar 14;19(6):1287. doi: 10.3390/s19061287.
2. Driver's Head Pose and Gaze Zone Estimation Based on Multi-Zone Templates Registration and Multi-Frame Point Cloud Fusion.
Sensors (Basel). 2022 Apr 20;22(9):3154. doi: 10.3390/s22093154.
3. Dual-Cameras-Based Driver's Eye Gaze Tracking System with Non-Linear Gaze Point Refinement.
Sensors (Basel). 2022 Mar 17;22(6):2326. doi: 10.3390/s22062326.
4. Faster R-CNN and Geometric Transformation-Based Detection of Driver's Eyes Using Multiple Near-Infrared Camera Sensors.
Sensors (Basel). 2019 Jan 7;19(1):197. doi: 10.3390/s19010197.
5. A Driver Gaze Estimation Method Based on Deep Learning.
Sensors (Basel). 2022 May 23;22(10):3959. doi: 10.3390/s22103959.
6. Automatic Calibration Method for Driver's Head Orientation in Natural Driving Environment.
IEEE Trans Intell Transp Syst. 2012 Sep 21;14(1):303-310. doi: 10.1109/TITS.2012.2217377.
7. Combining head pose and eye location information for gaze estimation.
IEEE Trans Image Process. 2012 Feb;21(2):802-15. doi: 10.1109/TIP.2011.2162740. Epub 2011 Jul 22.
8. Deep Learning-Based Gaze Detection System for Automobile Drivers Using a NIR Camera Sensor.
Sensors (Basel). 2018 Feb 3;18(2):456. doi: 10.3390/s18020456.
9. Noncontact binocular eye-gaze tracking for point-of-gaze estimation in three dimensions.
IEEE Trans Biomed Eng. 2009 Mar;56(3):790-9. doi: 10.1109/TBME.2008.2005943. Epub 2008 Sep 26.
10. Highly Accurate and Fully Automatic 3D Head Pose Estimation and Eye Gaze Estimation Using RGB-3D Sensors and 3D Morphable Models.
Sensors (Basel). 2018 Dec 5;18(12):4280. doi: 10.3390/s18124280.

Cited By

1. Comprehensive Assessment of Artificial Intelligence Tools for Driver Monitoring and Analyzing Safety Critical Events in Vehicles.
Sensors (Basel). 2024 Apr 12;24(8):2478. doi: 10.3390/s24082478.
2. Single Camera Face Position-Invariant Driver's Gaze Zone Classifier Based on Frame-Sequence Recognition Using 3D Convolutional Neural Networks.
Sensors (Basel). 2022 Aug 5;22(15):5857. doi: 10.3390/s22155857.
3. Driver's Head Pose and Gaze Zone Estimation Based on Multi-Zone Templates Registration and Multi-Frame Point Cloud Fusion.
Sensors (Basel). 2022 Apr 20;22(9):3154. doi: 10.3390/s22093154.
4. Stable Gaze Tracking with Filtering Based on Internet of Things.
Sensors (Basel). 2022 Apr 20;22(9):3131. doi: 10.3390/s22093131.
5. Dual-Cameras-Based Driver's Eye Gaze Tracking System with Non-Linear Gaze Point Refinement.
Sensors (Basel). 2022 Mar 17;22(6):2326. doi: 10.3390/s22062326.
6. Gaze in the Dark: Gaze Estimation in a Low-Light Environment with Generative Adversarial Networks.
Sensors (Basel). 2020 Aug 31;20(17):4935. doi: 10.3390/s20174935.

References

1. An Orientation Sensor-Based Head Tracking System for Driver Behaviour Monitoring.
Sensors (Basel). 2017 Nov 22;17(11):2692. doi: 10.3390/s17112692.
2. Design of a Fatigue Detection System for High-Speed Trains Based on Driver Vigilance Using a Wireless Wearable EEG.
Sensors (Basel). 2017 Mar 1;17(3):486. doi: 10.3390/s17030486.
3. Empirical Study on Designing of Gaze Tracking Camera Based on the Information of User's Head Movement.
Sensors (Basel). 2016 Aug 31;16(9):1396. doi: 10.3390/s16091396.
4. Compensation Method of Natural Head Movement for Gaze Tracking System Using an Ultrasonic Sensor for Distance Measurement.
Sensors (Basel). 2016 Jan 16;16(1):110. doi: 10.3390/s16010110.
5. Integration of Body Sensor Networks and Vehicular Ad-hoc Networks for Traffic Safety.
Sensors (Basel). 2016 Jan 15;16(1):107. doi: 10.3390/s16010107.
6. Adaptive Linear Regression for Appearance-Based Gaze Estimation.
IEEE Trans Pattern Anal Mach Intell. 2014 Oct;36(10):2033-46. doi: 10.1109/TPAMI.2014.2313123.
7. Head Pose Estimation on Top of Haar-Like Face Detection: A Study Using the Kinect Sensor.
Sensors (Basel). 2015 Aug 26;15(9):20945-66. doi: 10.3390/s150920945.
8. An investigation on the feasibility of uncalibrated and unconstrained gaze tracking for human assistive applications by using head pose estimation.
Sensors (Basel). 2014 May 12;14(5):8363-79. doi: 10.3390/s140508363.
9. Remote gaze tracking system on a large display.
Sensors (Basel). 2013 Oct 7;13(10):13439-63. doi: 10.3390/s131013439.
10. In the eye of the beholder: a survey of models for eyes and gaze.
IEEE Trans Pattern Anal Mach Intell. 2010 Mar;32(3):478-500. doi: 10.1109/TPAMI.2009.30.