

Drivers use active gaze to monitor waypoints during automated driving.

Affiliations

School of Psychology, University of Leeds, Leeds, UK.

Cognitive Science, University of Helsinki, Helsinki, Finland.

Publication

Sci Rep. 2021 Jan 8;11(1):263. doi: 10.1038/s41598-020-80126-2.

DOI: 10.1038/s41598-020-80126-2
PMID: 33420150
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7794576/
Abstract

Automated vehicles (AVs) will change the role of the driver, from actively controlling the vehicle to primarily monitoring it. Removing the driver from the control loop could fundamentally change the way that drivers sample visual information from the scene, and in particular, alter the gaze patterns generated when under AV control. To better understand how automation affects gaze patterns this experiment used tightly controlled experimental conditions with a series of transitions from 'Manual' control to 'Automated' vehicle control. Automated trials were produced using either a 'Replay' of the driver's own steering trajectories or standard 'Stock' trials that were identical for all participants. Gaze patterns produced during Manual and Automated conditions were recorded and compared. Overall the gaze patterns across conditions were very similar, but detailed analysis shows that drivers looked slightly further ahead (increased gaze time headway) during Automation with only small differences between Stock and Replay trials. A novel mixture modelling method decomposed gaze patterns into two distinct categories and revealed that the gaze time headway increased during Automation. Further analyses revealed that while there was a general shift to look further ahead (and fixate the bend entry earlier) when under automated vehicle control, similar waypoint-tracking gaze patterns were produced during Manual driving and Automation. The consistency of gaze patterns across driving modes suggests that active-gaze models (developed for manual driving) might be useful for monitoring driver engagement during Automated driving, with deviations in gaze behaviour from what would be expected during manual control potentially indicating that a driver is not closely monitoring the automated system.
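The "mixture modelling method" mentioned above decomposes the distribution of gaze time headways into two distinct categories. The paper does not specify its implementation here; as a minimal illustration of the idea, the sketch below fits a two-component 1-D Gaussian mixture by plain EM to hypothetical gaze-time-headway samples (the data, cluster locations, and function names are all assumptions for demonstration, not the authors' method):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gaze time headway samples (seconds): two clusters,
# e.g. close "waypoint-tracking" fixations ~1.5 s and farther
# "look-ahead" fixations ~3.0 s. Purely illustrative values.
data = np.concatenate([rng.normal(1.5, 0.3, 300),
                       rng.normal(3.0, 0.5, 200)])

def fit_two_gaussians(x, n_iter=200):
    """Basic EM for a 2-component 1-D Gaussian mixture."""
    # Initialise component means from the data quartiles.
    mu = np.quantile(x, [0.25, 0.75])
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample.
        dens = (pi / (sigma * np.sqrt(2 * np.pi)) *
                np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, standard deviations.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

pi, mu, sigma = fit_two_gaussians(data)
print(mu)  # component means, roughly near the two generating clusters
```

Comparing the fitted component means and weights between Manual and Automated trials is one way such a decomposition could reveal a shift toward larger gaze time headways under automation.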


Figures 1–12 (full-text images):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/ebc9bb6e49ca/41598_2020_80126_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/ee59d6e69ba9/41598_2020_80126_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/985f37c07e52/41598_2020_80126_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/11090e55100d/41598_2020_80126_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/406da64a6665/41598_2020_80126_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/66018ac4f428/41598_2020_80126_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/8d1bae8e47ea/41598_2020_80126_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/f6b0f71f3ee7/41598_2020_80126_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/67208ac34346/41598_2020_80126_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/1adcf539dd5c/41598_2020_80126_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/42ec835bcc0e/41598_2020_80126_Fig11_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/796b/7794576/2931aa556782/41598_2020_80126_Fig12_HTML.jpg
