Gabbard Joseph L, Smith Missie, Tanous Kyle, Kim Hyungil, Jonas Bryan
Grado Department of Industrial and Systems Engineering, Virginia Tech, Blacksburg, VA, United States.
Industrial and Systems Engineering, Oakland University, Rochester, NY, United States.
Front Robot AI. 2019 Oct 23;6:98. doi: 10.3389/frobt.2019.00098. eCollection 2019.
Optical see-through automotive head-up displays (HUDs) are a form of augmented reality (AR) that is quickly gaining penetration into the consumer market. Despite increasing adoption, demand, and competition among manufacturers to deliver higher-quality HUDs with larger fields of view, little work has been done to understand how best to design and assess AR HUD user interfaces, and how to quantify their effects on driver behavior, performance, and ultimately safety. This paper reports on a novel, low-cost, immersive driving simulator built from custom hardware and software technologies specifically to examine basic and applied research questions related to AR HUD usage while driving. We describe our experiences developing the simulator hardware and software and detail a user study that examines driver performance, visual attention, and preferences using two AR navigation interfaces. Results suggest that conformal AR graphics may not be inherently better than other HUD interfaces. We include lessons learned from our simulator development, report the results of the user study, and conclude with limitations and future work.