iASSIST: An iPhone-Based Multimedia Information System for Indoor Assistive Navigation

Zhigang Zhu, Jin Chen, Lei Zhang, Yaohua Chang, Tyler Franklin, Hao Tang, Arber Ruci
DOI: 10.4018/IJMDEM.2020100103


iASSIST is an iPhone-based assistive sensor solution for independent and safe travel for people who are blind or visually impaired, or anyone who faces challenges navigating an unfamiliar indoor environment. The solution integrates information from Bluetooth beacons, data connectivity, visual models, and user preferences. Hybrid models of interiors are created in a modeling stage from these multimodal data, which are collected and mapped to the floor plan as the modeler walks through the building. A client-server architecture allows scaling to large areas by lazy-loading models according to beacon signals and/or adjacent-region proximity. During the navigation stage, a user running the navigation app is localized within the floor plan using visual, connectivity, and user-preference data, and is guided along an optimal route to their destination. The user interfaces for both modeling and navigation employ multiple multimedia channels, including visual, audio, and haptic feedback, tailored to the target users. The design of human subject test experiments is also described, together with some preliminary experimental results.


According to data from the World Health Organization, at least 2.2 billion people, more than a quarter of the world's population, suffer from some degree of visual impairment or blindness (World Health Organization, 2019). An earlier report put the figures at 285 million people with low vision worldwide, 39 million of whom were suffering from blindness (World Health Organization, 2012). In the US alone, the blind or visually impaired (BVI) population has reached 6.6 million people and is expected to double by 2030 (Varma et al., 2016). As their vision deteriorates, BVI individuals often come to rely on a cane or a guide dog to find their way. Although existing technologies such as GPS have been leveraged to provide outdoor navigation, there remains a need for an assistive technology that aids these individuals with indoor navigation, which requires information that is unavailable to BVI individuals simply because of the lack of visual input.

In the BVI community, the most popular aids for meeting this need are still long canes and guide dogs (Sato et al., 2019). From our studies and discussions with orientation and mobility professionals, as well as with BVI users themselves, this appears to be due to a lack of consideration of users' needs and to the low availability, or production-readiness, of new and upcoming technologies. We were unable to find any suitable commercial products for use in our navigation studies, which prompted us to develop and test our own system, ASSIST (an acronym for Assistive Sensor Solutions for Independent and Safe Travel) (Nair et al., 2018; Nair et al., 2020). The first prototype of the ASSIST app localizes mobile devices via a hybrid positioning method that uses Bluetooth Low Energy (BLE) beacons for coarse localization in conjunction with fine positioning from an augmented reality framework based on Google Tango. However, Tango has since been deprecated by Google, which led to our current work on integrating ARCore on Android and ARKit on iOS for newer prototypes of the ASSIST apps (Chen et al., 2019; Chang et al., 2020).
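The hybrid positioning idea above, coarse region selection from BLE beacon signals followed by fine positioning from the AR framework's visually tracked pose, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the RSSI-based selection rule, the `Region` structure, and all names are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    beacon_ids: set    # BLE beacons installed in this region
    origin: tuple      # region origin in floor-plan coordinates (x, y)

def coarse_region(rssi_by_beacon, regions):
    """Pick the region whose beacons are heard loudest (illustrative rule)."""
    def strength(region):
        heard = [rssi for b, rssi in rssi_by_beacon.items()
                 if b in region.beacon_ids]
        return max(heard) if heard else float("-inf")
    return max(regions, key=strength)

def fine_position(region, ar_offset):
    """Map the AR framework's local pose offset into floor-plan coordinates."""
    ox, oy = region.origin
    dx, dy = ar_offset
    return (ox + dx, oy + dy)

regions = [
    Region("lobby", {"b1", "b2"}, (0.0, 0.0)),
    Region("hall_2f", {"b3"}, (25.0, 10.0)),
]
r = coarse_region({"b1": -70, "b3": -55}, regions)  # b3 loudest -> hall_2f
print(r.name, fine_position(r, (1.5, -0.5)))        # -> hall_2f (26.5, 9.5)
```

In practice the coarse step only needs to bound the search to one regional model; the AR framework then provides drift-corrected metric positioning inside that region.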

In this paper, we present iASSIST, an iOS assistive application built with ARKit (Apple: ARKit 2020) that provides turn-by-turn navigation assistance using accurate, real-time localization over large spaces without the installation of expensive infrastructure. This paper is an extension and continuation of the work reported in Chang et al. (2020), with new developments in system architecture, generalized localization and personalized path planning algorithms, and a number of system evaluation designs. The approach can also be easily extended to Android devices, for example, using Google’s ARCore. The mobile client is only one part of iASSIST, which itself is a multimedia information system with the following key components:

  1. An iOS-based application that provides turn-by-turn indoor navigation for BVI users with multimedia interaction, including voice interaction, haptic feedback, and visual directions.

  2. A client-server architecture for iASSIST's hybrid models, which include visual, beacon, connectivity, destination, landmark, and other feature information, and which allows scaling to large areas by lazy-loading models using beacon signals and/or adjacent-region proximity.

  3. A highly accurate, low-cost indoor positioning solution with a generalized localization algorithm that addresses the regional model transition problems faced when large areas must be divided into smaller regions.

  4. A graph-based representation that connects local regions with traversable paths between nodes, where nodes are interactively selectable destinations and landmarks, either manually designated or automatically extracted along paths during the modeling stage.

  5. A personalized route planning algorithm weighted by user preference and hazard potential, which also takes into account the Wi-Fi/cellular download speed along the planned path.
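The lazy-loading behavior of the client-server architecture (component 2) can be sketched as a small client-side cache that fetches a region's model on first use and prefetches adjacent regions. The class and function names here are illustrative assumptions, not the iASSIST API.

```python
class ModelClient:
    """Lazily loads per-region models from a server stub (names illustrative)."""
    def __init__(self, fetch, adjacency):
        self.fetch = fetch          # fetch(region_id) -> model payload
        self.adjacency = adjacency  # region_id -> neighboring region_ids
        self.cache = {}

    def get(self, region_id):
        if region_id not in self.cache:
            self.cache[region_id] = self.fetch(region_id)
        return self.cache[region_id]

    def enter(self, region_id):
        """Load the current region's model and prefetch its neighbors."""
        model = self.get(region_id)
        for neighbor in self.adjacency.get(region_id, ()):
            self.get(neighbor)
        return model

loaded = []
def fake_fetch(rid):
    loaded.append(rid)
    return {"region": rid}

client = ModelClient(fake_fetch, {"lobby": ["hall"], "hall": ["lobby", "cafe"]})
client.enter("lobby")   # loads lobby and its neighbor hall, but not cafe
print(loaded)           # -> ['lobby', 'hall']
```

Triggering `enter` from either a detected beacon signal or proximity to a region boundary keeps the on-device memory footprint bounded while the user moves through a large building.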
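Components 4 and 5 combine naturally: personalized route planning runs a shortest-path search over the landmark graph with edge costs adjusted per user. A minimal sketch, assuming Dijkstra's algorithm with distance scaled by a user-preference penalty plus an additive hazard term (the paper does not specify this exact cost form; it and all names below are illustrative):

```python
import heapq

def plan_route(graph, start, goal, hazard, preference):
    """Dijkstra over the landmark graph with personalized edge costs.
    graph: node -> {neighbor: distance}; hazard/preference are per-edge
    penalty functions supplied by the user's profile (illustrative)."""
    frontier = [(0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        for nbr, dist in graph.get(node, {}).items():
            c = cost + dist * preference(node, nbr) + hazard(node, nbr)
            if c < best.get(nbr, float("inf")):
                best[nbr] = c
                heapq.heappush(frontier, (c, nbr, path + [nbr]))
    return float("inf"), []

graph = {
    "entrance": {"stairs": 5, "elevator": 8},
    "stairs": {"office": 4},
    "elevator": {"office": 3},
}
# A user who avoids stairs: heavy penalty on any edge touching the stairs node.
avoid_stairs = lambda a, b: 10.0 if "stairs" in (a, b) else 1.0
no_hazard = lambda a, b: 0.0
cost, path = plan_route(graph, "entrance", "office", no_hazard, avoid_stairs)
print(path)   # -> ['entrance', 'elevator', 'office']
```

The paper's Wi-Fi/cellular download-speed consideration would fold in the same way, as another additive or multiplicative term on edges passing through poorly connected areas.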
