Follow the Leader: Examining Real and Augmented Reality Lead Vehicles as Driving Navigational Aids

Bethan Hannah Topliss (The University of Nottingham, Nottingham, UK), Sanna M. Pampel (University of Nottingham, Nottingham, UK), Gary Burnett (Human Factors Research Group, Faculty of Engineering, University of Nottingham, Nottingham, UK), Lee Skrypchuk (Jaguar Land Rover Research, Coventry, UK) and Chrisminder Hare (Jaguar Land Rover Research, Coventry, UK)
Copyright: © 2019 | Pages: 20
DOI: 10.4018/IJMHCI.2019040102

Abstract

Two studies investigated the concept of following a lead vehicle as a navigational aid. The first, video-based study (n=34) considered how drivers might use a real-world lead vehicle as a navigational aid, whilst the second, simulator-based study (n=22) explored how an Augmented Reality (AR) virtual car, presented on a head-up display (HUD), may aid navigation around a complex junction. Study 1 indicated that a lead vehicle is most valued as a navigation aid just before and during a required manoeuvre. In Study 2, a dynamic virtual car (which behaved like a real vehicle) resulted in greater confidence and lower workload than a static virtual car that "waits" at the correct junction exit, but also produced greater gaze concentration. It is concluded that a virtual car may be a valuable element of a navigation system, in combination with other forms of information, to fully meet a driver's navigational task requirements.
Article Preview

Introduction

The rapid development of head-up displays (HUDs) is reducing the limitations on how navigational aids may function within vehicles. At present, information can be layered over the driver’s view of the road environment (Gabbard et al., 2014), potentially reducing the need to look away from the road scene to gather display information (cf. Victor, 2005). Hence, augmentation of the road environment presents a tempting opportunity to provide the driver with information more effectively, as many researchers have begun to investigate (e.g. Tonnis, Sandor, Klinker, Lange, & Bubb, 2005). Novel augmented reality (AR) HUD concepts highlight hazards in real time to direct the driver’s attention to safety-critical information (Park, Park, Won, Kim, & Jung, 2013). Others aid navigation by highlighting relevant road signs (Chu, Brewer, & Joseph, 2008) or by superimposing paper airplanes onto the road environment, which act as arrows indicating a direction (Bark, Tran, Fujimura, & Ng-Thow-Hing, 2014). A study of AR navigation systems that highlight relevant landmarks found that these landmark cues required less visual attention than conventional cues (Bolton et al., 2015). The present work investigates a novel approach to aiding navigation using an AR HUD.

Using a ‘front’ vehicle as a navigational aid may be considered a broadly familiar experience: a driver who knows a route may lead another, unaware driver, who follows behind in a separate vehicle. Although car following has been examined extensively from the perspective of general traffic flow, with some consideration of driver behaviour (Ranney, 1999), minimal research has investigated car following for navigational purposes (McNabb, Kuzel, & Gray, 2017).

This work aims to clarify how drivers use a lead vehicle as a navigational aid in this manner; to establish how such a lead vehicle affects visual behaviour (eye movements), since driving is a predominantly visual task (Foley, 2009); and then to examine how an AR version of this concept may perform in a specific navigational scenario.
