The Effects of Augmented Reality Head-Up Displays on Drivers' Eye Scan Patterns, Performance, and Perceptions

Missie Smith (Virginia Tech, Blacksburg, VA, USA), Joseph L. Gabbard (Grado Department of Industrial & Systems Engineering, Virginia Tech, Blacksburg, VA, USA), Gary Burnett (Human Factors Research Group, Faculty of Engineering, University of Nottingham, Nottingham, UK) and Nadejda Doutcheva (Virginia Tech, Blacksburg, VA, USA)
Copyright: © 2017 | Pages: 17
DOI: 10.4018/IJMHCI.2017040101

Abstract

This paper reports on an experiment comparing Head-Up Display (HUD) and Head-Down Display (HDD) use while driving in a simulator, exploring differences in glance patterns, driving performance, and user preferences. Sixteen participants completed both structured (text) and semi-structured (grid) visual search tasks on each display while following a lead vehicle in a motorway (highway) environment. Participants experienced three levels of complexity (low, medium, high) for each visual search task, with five repetitions of each level. Results suggest that the grid task was not sensitive enough to the varying visual demands, while the text task showed significant differences between displays in user preference, perceived workload, and distraction. As complexity increased, HUD use during the text task corresponded with faster performance than HDD use, indicating potential benefits of HUDs in the driving context. Furthermore, HUD use was associated with longer sustained glances (at the respective display) than HDD use, with no differences in driving performance observed. This finding suggests that AR HUDs afford longer glances without negatively affecting the longitudinal and lateral control of the vehicle – a result with implications for how future researchers should evaluate the visual demands of AR HUDs.

Introduction

In 2010 alone, driver distraction attributed to an in-vehicle device, display, or control is estimated to have caused 26,000 crashes in the United States (NHTSA, 2012). In an ideal world, drivers would be completely focused on the driving task at hand; instead, they are frequently distracted by secondary tasks (NHTSA, 2012). Driver distraction refers to manual, visual, or cognitive distraction as defined in the NHTSA guidelines (NHTSA, 2012). Driving is inherently a visual task, as most of the information drivers need (e.g., visual flow, speed relative to the vehicle ahead, navigation cues, roadway hazards) is conveyed visually. Therefore, when drivers take their eyes off the roadway for any reason, they may miss information necessary for safe driving. While in-vehicle displays can provide timely navigation instructions or other driving-related information, using such displays makes drivers susceptible to visual distraction. Thus, it is essential to understand how new in-vehicle devices may distract drivers and to apply this knowledge to the design of future vehicle-based displays.

Augmented reality (AR) uses a visual display to overlay virtual images onto a person’s view of the real world (Azuma, 1997). AR displays can provide information that is not readily available to the user when simply viewing the surrounding environment. In some cases, AR displays may reduce the distraction inherent in viewing information on a separate display, because visual information innate to the environment can be gathered simultaneously with relevant visual information presented via the AR display (Azuma, 1997). Broadly speaking, AR interfaces can contain both conformal (registered) graphics, which are perceptually attached to the real world, and non-conformal (screen-fixed) graphics, which remain fixed in the screen space but are nonetheless overlaid atop the user’s view of the real world. Conformal and non-conformal AR images can be conveyed to the user via either head-up displays (HUDs) or head-down displays (HDDs). Optical see-through HUDs can be positioned very near drivers’ natural line of sight, potentially allowing users to continue looking at the road scene while using an in-vehicle AR display. Conversely, both video-based AR and traditional HDDs require users to look away from their preferred line of sight to gather information. In addition, the focal distance of HDDs requires drivers to accommodate to near distances (less than one meter), while HUDs may provide a range of focal distances, generally between two meters and optical infinity, depending upon the display hardware design.

HDDs have dominated in-vehicle displays, with the center console and dashboard serving as the main operating centers for drivers’ secondary and tertiary tasks. While HUDs have long been used in aircraft, recent years have seen renewed interest in implementing them in ground vehicles. In the coming years, AR-based HUDs will likely become increasingly available in commercial vehicles, offering a range of driving-related functions (e.g., supporting primary, secondary, and tertiary tasks). However, the potential effects of AR HUDs on driver distraction, and thus on driving performance, need further exploration.
