Introduction
According to data from the World Health Organization, at least 2.2 billion people, more than a quarter of the world's population, suffer from some degree of visual impairment or blindness (World Health Organization, 2019). An earlier report estimated 285 million people with low vision worldwide and 39 million suffering from blindness (World Health Organization, 2012). In the US alone, the blind or visually impaired (BVI) population has reached 6.6 million and is expected to double by 2030 (Varma et al., 2016). As their vision deteriorates, BVI individuals often rely on a cane or a guide dog to find their way. Although existing technologies such as GPS have been leveraged to provide outdoor navigation, there remains a need for assistive technology that aids these individuals in indoor navigation, which requires information unavailable to BVI individuals simply due to the lack of visual input.
In the BVI community, the most widely used technologies for meeting this need are still long canes and guide dogs (Sato et al., 2019). From our studies and discussions with orientation and mobility professionals and with BVI users themselves, this appears to stem from a lack of consideration of users' needs and from the low availability, or production-readiness, of new and upcoming technologies. We were unable to find any suitable commercial product for use in our navigation studies, which prompted us to develop and test our own system, ASSIST (an acronym for Assistive Sensor Solutions for Independent and Safe Travel) (Nair et al., 2018; Nair et al., 2020). The first prototype of the ASSIST app localizes mobile devices via a hybrid positioning method: Bluetooth Low Energy (BLE) beacons provide coarse localization, while an augmented reality framework based on Google Tango provides fine positioning. However, Google has since deprecated Tango, which led to our current work integrating ARCore on Android and ARKit on iOS for newer prototypes of the ASSIST apps (Chen et al., 2019; Chang et al., 2020).
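To make the coarse-to-fine idea concrete, the following is a minimal, language-agnostic sketch (shown in Python for brevity; it is not the ASSIST implementation). The assumed model: the strongest BLE beacon selects a region, and the AR framework's local pose is offset by that region's origin to yield a position in the global map frame. All names (`Beacon`, `coarse_region`, `fine_position`) and data values are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Beacon:
    region: str   # region model this beacon belongs to
    rssi: float   # received signal strength in dBm (higher means closer)

@dataclass
class Pose:
    x: float      # AR framework pose, local to the region's origin
    y: float

def coarse_region(beacons):
    """Coarse localization: pick the region of the strongest BLE beacon."""
    return max(beacons, key=lambda b: b.rssi).region

def fine_position(region_origin, ar_pose):
    """Fine positioning: offset the AR pose by the region model's origin
    to obtain a position in the global map frame."""
    ox, oy = region_origin
    return (ox + ar_pose.x, oy + ar_pose.y)

# Hypothetical data: two beacons heard, AR pose relative to the region origin.
beacons = [Beacon("lobby", -72.0), Beacon("hallway-2", -58.0)]
region_origins = {"lobby": (0.0, 0.0), "hallway-2": (30.0, 12.0)}

region = coarse_region(beacons)
position = fine_position(region_origins[region], Pose(1.5, -0.5))
print(region, position)  # hallway-2 (31.5, 11.5)
```

In practice the AR pose is a full 6-DoF transform and beacon signals are noisy and filtered over time; this sketch only shows the division of labor between the two sources.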
In this paper, we present iASSIST, an iOS assistive application built with ARKit (Apple: ARKit 2020) that provides turn-by-turn navigation assistance using accurate, real-time localization over large spaces, without the installation of expensive infrastructure. This paper extends and continues the work reported in Chang et al. (2020), with new developments in the system architecture, a generalized localization algorithm, a personalized path planning algorithm, and a number of system evaluation designs. The approach can also be easily extended to Android devices, for example using Google's ARCore. The mobile client is only one part of iASSIST, which is itself a multimedia information system with the following key components:
1. An iOS-based application that provides turn-by-turn indoor navigation for BVI users through multimedia interaction, including voice interaction, haptic feedback, and visual directions.
2. A client-server architecture for iASSIST hybrid models incorporating visual features, beacon signals, connectivity, destinations, landmarks, and other information, which scales to large areas by lazy-loading models based on beacon signals and/or adjacent-region proximity.
3. A highly accurate, low-cost indoor positioning solution with a generalized localization algorithm that addresses the regional model transition problem that arises when large areas must be divided into smaller regions.
4. A graph-based representation that connects local regions with traversable paths between nodes, where nodes are interactively selectable destinations and landmarks, either manually designated or automatically extracted along paths during the modeling stage.
5. A personalized route planning algorithm weighted by user preferences and hazard potential, which also considers the Wi-Fi/cellular download speed along the planned path.
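To illustrate the kind of graph-based, preference-weighted planning described in components 4 and 5, the following is a minimal sketch, not the iASSIST implementation: a Dijkstra search over a landmark graph whose edge cost inflates hazardous edges by a user-specific weight. The graph, hazard scores, and cost model are all hypothetical assumptions for illustration.

```python
import heapq

def plan_route(graph, start, goal, hazard_weight=2.0):
    """Dijkstra over a destination/landmark graph. Each edge carries a
    physical length (meters) and a hazard score in [0, 1]; the effective
    cost inflates hazardous edges by a user-specific weight."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, length, hazard in graph.get(node, []):
            cost = d + length * (1.0 + hazard_weight * hazard)
            if cost < dist.get(nbr, float("inf")):
                dist[nbr] = cost
                prev[nbr] = node
                heapq.heappush(heap, (cost, nbr))
    # Reconstruct the path by walking predecessors back from the goal.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# Hypothetical floor graph: edges are (neighbor, length_m, hazard).
graph = {
    "entrance": [("stairs", 10.0, 0.8), ("elevator", 14.0, 0.0)],
    "stairs":   [("office", 5.0, 0.1)],
    "elevator": [("office", 5.0, 0.0)],
}
print(plan_route(graph, "entrance", "office"))
# -> ['entrance', 'elevator', 'office']: the shorter stairs route loses
#    once its hazard score is weighted in.
```

A real cost function would fold in additional terms per component 5, such as a penalty for path segments with poor Wi-Fi/cellular download speed; each term slots into the same per-edge cost expression.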