

Group trust dynamics during a risky driving experience in a Tesla Model X.

Author Information

Momen Ali, de Visser Ewart J, Fraune Marlena R, Madison Anna, Rueben Matthew, Cooley Katrina, Tossell Chad C

Affiliations

United States Air Force Academy, Colorado Springs, CO, United States.

Department of Psychology, New Mexico State University, Las Cruces, NM, United States.

Publication Information

Front Psychol. 2023 Jun 20;14:1129369. doi: 10.3389/fpsyg.2023.1129369. eCollection 2023.

DOI: 10.3389/fpsyg.2023.1129369
PMID: 37408965
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10319128/
Abstract

The growing concern about the risk and safety of autonomous vehicles (AVs) has made it vital to understand driver trust and behavior when operating AVs. While research has uncovered human factors and design issues based on individual driver performance, there remains a lack of insight into how trust in automation evolves in groups of people who face risk and uncertainty while traveling in AVs. To this end, we conducted a naturalistic experiment with groups of participants who were encouraged to engage in conversation while riding a Tesla Model X on campus roads. Our methodology was uniquely suited to uncover these issues through naturalistic interaction by groups in the face of a risky driving context. Conversations were analyzed, revealing several themes pertaining to trust in automation: (1) collective risk perception, (2) experimenting with automation, (3) group sense-making, (4) human-automation interaction issues, and (5) benefits of automation. Our findings highlight the untested and experimental nature of AVs and confirm serious concerns about the safety and readiness of this technology for on-road use. The process of determining appropriate trust and reliance in AVs will therefore be essential for drivers and passengers to ensure the safe use of this experimental and continuously changing technology. Revealing insights into social group-vehicle interaction, our results speak to the potential dangers and ethical challenges with AVs as well as provide theoretical insights on group trust processes with advanced technology.


Figures (8):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4818/10319128/16cb9089bb6e/fpsyg-14-1129369-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4818/10319128/08e67519da63/fpsyg-14-1129369-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4818/10319128/a833a60636f0/fpsyg-14-1129369-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4818/10319128/406ba5f15427/fpsyg-14-1129369-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4818/10319128/c1215c2d0893/fpsyg-14-1129369-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4818/10319128/c9bcec87e8dd/fpsyg-14-1129369-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4818/10319128/53e588a82614/fpsyg-14-1129369-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4818/10319128/2dda4763382a/fpsyg-14-1129369-g0008.jpg

Similar Articles

1. Group trust dynamics during a risky driving experience in a Tesla Model X.
Front Psychol. 2023 Jun 20;14:1129369. doi: 10.3389/fpsyg.2023.1129369. eCollection 2023.

2. Fostering Drivers' Trust in Automated Driving Styles: The Role of Driver Perception of Automated Driving Maneuvers.
Hum Factors. 2024 Jul;66(7):1961-1976. doi: 10.1177/00187208231189661. Epub 2023 Jul 25.

3. Driving Aggressively or Conservatively? Investigating the Effects of Automated Vehicle Interaction Type and Road Event on Drivers' Trust and Preferred Driving Style.
Hum Factors. 2024 Sep;66(9):2166-2178. doi: 10.1177/00187208231181199. Epub 2023 Jun 9.

4. The Impact of Cybersecurity Attacks on Human Trust in Autonomous Vehicle Operations.
Hum Factors. 2025 May;67(5):485-502. doi: 10.1177/00187208241283321. Epub 2024 Sep 18.

5. Using voice recognition to measure trust during interactions with automated vehicles.
Appl Ergon. 2024 Apr;116:104184. doi: 10.1016/j.apergo.2023.104184. Epub 2023 Dec 3.

6. Drivers' trust, acceptance, and takeover behaviors in fully automated vehicles: Effects of automated driving styles and driver's driving styles.
Accid Anal Prev. 2021 Sep;159:106238. doi: 10.1016/j.aap.2021.106238. Epub 2021 Jun 25.

7. Sharing roads with automated vehicles: A questionnaire investigation from drivers', cyclists' and pedestrians' perspectives.
Accid Anal Prev. 2023 Aug;188:107093. doi: 10.1016/j.aap.2023.107093. Epub 2023 May 5.

8. Perceptions of safety on a shared road: Driving, cycling, or walking near an autonomous vehicle.
J Safety Res. 2020 Feb;72:249-258. doi: 10.1016/j.jsr.2019.12.017. Epub 2020 Jan 14.

9. Communication via motion - Suitability of automated vehicle movements to negotiate the right of way in road bottleneck scenarios.
Appl Ergon. 2021 Sep;95:103438. doi: 10.1016/j.apergo.2021.103438. Epub 2021 Apr 23.

10. Challenges to Human Drivers in Increasingly Automated Vehicles.
Hum Factors. 2020 Mar;62(2):310-328. doi: 10.1177/0018720819900402. Epub 2020 Feb 5.

References Cited in This Article

1. Measurement of Trust in Automation: A Narrative Review and Reference Guide.
Front Psychol. 2021 Oct 19;12:604977. doi: 10.3389/fpsyg.2021.604977. eCollection 2021.

2. Trusting Automation: Designing for Responsivity and Resilience.
Hum Factors. 2023 Feb;65(1):137-165. doi: 10.1177/00187208211009995. Epub 2021 Apr 27.

3. What's Driving Me? Exploration and Validation of a Hierarchical Personality Model for Trust in Automated Driving.
Hum Factors. 2021 Sep;63(6):1076-1105. doi: 10.1177/0018720820922653. Epub 2020 Jul 6.

4. Evolving Trust in Robots: Specification Through Sequential and Comparative Meta-Analyses.
Hum Factors. 2021 Nov;63(7):1196-1229. doi: 10.1177/0018720820922080. Epub 2020 Jun 10.

5. Digital Emotion Contagion.
Trends Cogn Sci. 2020 Apr;24(4):316-328. doi: 10.1016/j.tics.2020.01.009. Epub 2020 Feb 18.

6. Assessing Drivers' Trust of Automated Vehicle Driving Styles With a Two-Part Mixed Model of Intervention Tendency and Magnitude.
Hum Factors. 2021 Mar;63(2):197-209. doi: 10.1177/0018720819880363. Epub 2019 Oct 9.

7. Driving distracted with friends: Effect of passengers and driver distraction on young drivers' behavior.
Accid Anal Prev. 2019 Nov;132:105246. doi: 10.1016/j.aap.2019.07.022. Epub 2019 Aug 14.

8. Trust and Distrust of Automated Parking in a Tesla Model X.
Hum Factors. 2020 Mar;62(2):194-210. doi: 10.1177/0018720819865412. Epub 2019 Aug 16.

9. The More You Know: Trust Dynamics and Calibration in Highly Automated Driving and the Effects of Take-Overs, System Malfunction, and System Transparency.
Hum Factors. 2020 Aug;62(5):718-736. doi: 10.1177/0018720819853686. Epub 2019 Jun 24.

10. Pitfalls of automation: a faulty narrative?
Ergonomics. 2019 Apr;62(4):505-508. doi: 10.1080/00140139.2019.1563334. Epub 2019 Apr 7.