

Improving Night Time Driving Safety Using Vision-Based Classification Techniques.

Authors

Chien Jong-Chih, Chen Yong-Sheng, Lee Jiann-Der

Affiliations

Degree Program of Digital Space and Product Design, Kainan University, Taoyuan City 338, Taiwan.

Department of Electrical Engineering, Chang-Gung University, Taoyuan City 333, Taiwan.

Publication

Sensors (Basel). 2017 Sep 24;17(10):2199. doi: 10.3390/s17102199.

DOI: 10.3390/s17102199
PMID: 28946643
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5677414/
Abstract

The risks involved in nighttime driving include drowsy drivers and dangerous vehicles. Prominent among the more dangerous vehicles around at night are the larger vehicles, which are usually moving faster at night on a highway. In addition, the risk level of driving around larger vehicles rises significantly when the driver's attention becomes distracted, even for a short period of time. For the purpose of alerting the driver and elevating his or her safety, in this paper we propose two components for any modern vision-based Advanced Driver Assistance System (ADAS). These two components work separately for the single purpose of alerting the driver in dangerous situations. The purpose of the first component is to ascertain that the driver would be in a sufficiently wakeful state to receive and process warnings; this is the driver drowsiness detection component. The driver drowsiness detection component uses infrared images of the driver to analyze his eyes' movements using MSR plus a simple heuristic. This component issues alerts to the driver when the driver's eyes show distraction and are closed for a longer than usual duration. Experimental results show that this component can detect closed eyes with an accuracy of 94.26% on average, which is comparable to previous results using more sophisticated methods. The purpose of the second component is to alert the driver when the driver's vehicle is moving around larger vehicles at dusk or night time. The large vehicle detection component accepts images from a regular video driving recorder as input. A bi-level system of classifiers, which includes a novel MSR-enhanced KAZE-based Bag-of-Features classifier, is proposed to avoid false negatives. In both components, we propose an improved version of the Multi-Scale Retinex (MSR) algorithm to augment the contrast of the input. Several experiments were performed to test the effects of the MSR and each classifier, and the results are presented in the experimental results section of this paper.
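The contrast-enhancement step named in the abstract can be sketched as a classic Multi-Scale Retinex: the average, over several Gaussian scales, of the log-ratio between the image and its blurred illumination estimate. This is a minimal sketch of the standard MSR, not the authors' improved variant; the scale values are illustrative.

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur with edge padding (pure NumPy)."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-(x ** 2) / (2 * sigma ** 2))
    kernel /= kernel.sum()
    padded = np.pad(img, ((0, 0), (radius, radius)), mode="edge")
    rows = np.apply_along_axis(lambda r: np.convolve(r, kernel, "valid"), 1, padded)
    padded = np.pad(rows, ((radius, radius), (0, 0)), mode="edge")
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, "valid"), 0, padded)

def multi_scale_retinex(img, sigmas=(5, 25, 100), eps=1e-6):
    """MSR: average of log(image) - log(blurred image) over several
    scales, then rescaled to [0, 1] for display."""
    img = img.astype(np.float64) + eps
    msr = sum(np.log(img) - np.log(gaussian_blur(img, s) + eps) for s in sigmas)
    msr /= len(sigmas)
    return (msr - msr.min()) / (msr.max() - msr.min() + eps)

# Demo on a synthetic dim frame standing in for a night-time image
rng = np.random.default_rng(0)
dark = rng.uniform(0.02, 0.10, size=(64, 64))
enhanced = multi_scale_retinex(dark)
print(enhanced.shape)  # (64, 64)
```

Because the log-ratio removes the slowly varying illumination component, the output spreads a dim image's values across the full display range, which is why the paper applies it before both the eye analysis and the vehicle classifier.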

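The drowsiness component's alerting rule (eyes closed for longer than usual) can be sketched as a consecutive-frame counter over a per-frame open/closed decision. The class name and threshold below are illustrative assumptions, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class DrowsinessMonitor:
    """Frame-counting heuristic: raise an alert once the eyes stay
    closed for more than max_closed_frames consecutive frames."""
    max_closed_frames: int = 15  # hypothetical, e.g. ~0.5 s at 30 fps
    closed_run: int = 0

    def update(self, eyes_closed: bool) -> bool:
        """Feed one frame's closed-eye decision; return True on alert."""
        if eyes_closed:
            self.closed_run += 1
        else:
            self.closed_run = 0  # any open-eye frame resets the run
        return self.closed_run > self.max_closed_frames

monitor = DrowsinessMonitor(max_closed_frames=3)
frames = [False, True, True, True, True, False]
alerts = [monitor.update(c) for c in frames]
print(alerts)  # [False, False, False, False, True, False]
```

A run counter rather than a raw per-frame classifier is what makes the alert robust to blinks: a single closed-eye frame resets nothing dangerous, only a sustained closure crosses the threshold.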

Figures (g001–g022):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/cd61f48f77f9/sensors-17-02199-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/eda0e5e8e067/sensors-17-02199-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/4eb50a26b693/sensors-17-02199-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/935e599b6cc8/sensors-17-02199-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/aa01209ccf7e/sensors-17-02199-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/cfe318b554b9/sensors-17-02199-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/b34f02cb1a7c/sensors-17-02199-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/cf29a86b8b4d/sensors-17-02199-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/0ed510af560d/sensors-17-02199-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/7abb3f2f7b36/sensors-17-02199-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/058faed367ae/sensors-17-02199-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/5d71b841030b/sensors-17-02199-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/fc1d2e699675/sensors-17-02199-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/b92197672a9f/sensors-17-02199-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/fe56444458c7/sensors-17-02199-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/fa02eb3a7b92/sensors-17-02199-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/9a4f18f9eb75/sensors-17-02199-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/93fad816831f/sensors-17-02199-g018.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/4b29ef032da3/sensors-17-02199-g019.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/5e97e57f5a26/sensors-17-02199-g020.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/826c2d09d03d/sensors-17-02199-g021.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/908a/5677414/5741689e946e/sensors-17-02199-g022.jpg

Similar Articles

1
Improving Night Time Driving Safety Using Vision-Based Classification Techniques.
Sensors (Basel). 2017 Sep 24;17(10):2199. doi: 10.3390/s17102199.
2
Prediction of Driver's Intention of Lane Change by Augmenting Sensor Information Using Machine Learning Techniques.
Sensors (Basel). 2017 Jun 10;17(6):1350. doi: 10.3390/s17061350.
3
Are child occupants a significant source of driving distraction?
Accid Anal Prev. 2011 May;43(3):1236-44. doi: 10.1016/j.aap.2011.01.005. Epub 2011 Feb 3.
4
Deep Neuro-Vision Embedded Architecture for Safety Assessment in Perceptive Advanced Driver Assistance Systems: The Pedestrian Tracking System Use-Case.
Front Neuroinform. 2021 Jul 30;15:667008. doi: 10.3389/fninf.2021.667008. eCollection 2021.
5
Faster R-CNN and Geometric Transformation-Based Detection of Driver's Eyes Using Multiple Near-Infrared Camera Sensors.
Sensors (Basel). 2019 Jan 7;19(1):197. doi: 10.3390/s19010197.
6
Driver Drowsiness Multi-Method Detection for Vehicles with Autonomous Driving Functions.
Sensors (Basel). 2024 Feb 28;24(5):1541. doi: 10.3390/s24051541.
7
Prediction of drowsiness events in night shift workers during morning driving.
Accid Anal Prev. 2019 May;126:105-114. doi: 10.1016/j.aap.2017.11.004. Epub 2017 Nov 7.
8
Estimation of Driver's Danger Level when Accessing the Center Console for Safe Driving.
Sensors (Basel). 2018 Oct 10;18(10):3392. doi: 10.3390/s18103392.
9
Lightweight Driver Monitoring System Based on Multi-Task Mobilenets.
Sensors (Basel). 2019 Jul 20;19(14):3200. doi: 10.3390/s19143200.
10
A Proactive Recognition System for Detecting Commercial Vehicle Driver's Distracted Behavior.
Sensors (Basel). 2022 Mar 19;22(6):2373. doi: 10.3390/s22062373.

Cited By

1
Vision-Based On-Road Nighttime Vehicle Detection and Tracking Using Improved HOG Features.
Sensors (Basel). 2024 Feb 29;24(5):1590. doi: 10.3390/s24051590.
2
Low-Light Image Enhancement Using Hybrid Deep-Learning and Mixed-Norm Loss Functions.
Sensors (Basel). 2022 Sep 13;22(18):6904. doi: 10.3390/s22186904.
3
Gaze and Eye Tracking: Techniques and Applications in ADAS.

References

1
Robust Object Tracking with Online Multiple Instance Learning.
IEEE Trans Pattern Anal Mach Intell. 2011 Aug;33(8):1619-32. doi: 10.1109/TPAMI.2010.226. Epub 2010 Dec 23.
2
Survey of pedestrian detection for advanced driver assistance systems.
IEEE Trans Pattern Anal Mach Intell. 2010 Jul;32(7):1239-58. doi: 10.1109/TPAMI.2009.122.
3
Sensors (Basel). 2019 Dec 14;19(24):5540. doi: 10.3390/s19245540.
4
Adaptive Image Rendering Using a Nonlinear Mapping-Function-Based Retinex Model.
Sensors (Basel). 2019 Feb 25;19(4):969. doi: 10.3390/s19040969.
5
Estimation of Driver's Danger Level when Accessing the Center Console for Safe Driving.
Sensors (Basel). 2018 Oct 10;18(10):3392. doi: 10.3390/s18103392.