Crossmodal Audio and Tactile Interaction with Mobile Touchscreens

Eve Hoggan
Copyright: © 2010 | Pages: 16
DOI: 10.4018/jmhci.2010100102

Abstract

This article asserts that crossmodal auditory and tactile interaction can aid mobile touchscreen users in accessing data non-visually and, by providing a choice of modalities, can help to overcome problems that arise in mobile situations where one modality may be less suitable than another (Hoggan, 2010). By encoding data using the crossmodal parameters of audio and vibration, users can learn mappings and translate information between the two modalities. In this way, data can be presented through the modality most appropriate to the situation and surrounding environment.
Article Preview

Introduction

As mobile touchscreen technology has become more widespread, there have been many technological advances, but one key feature remains the same: touchscreen mobile devices often use cut-down versions of desktop user interfaces, placing high demands on the visual sense that may prove awkward in mobile settings. Mobile device users can be considered situationally impaired because they may temporarily lose the use of different senses at different times, in different locations, and in other situational contexts. Currently, many available devices support audio and tactile feedback for simple alerts, such as incoming call notifications, through standard built-in vibrotactile actuators and audio speakers. These may be leveraged to provide feedback through different sensory modalities when the visual sense is overloaded or unavailable.
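As a concrete illustration (not taken from the article, which predates these APIs), the Kotlin sketch below shows how a modern Android handset could drive both built-in channels for a simple bimodal alert; the waveform timings and tone values are arbitrary examples.

// Sketch only: drives the built-in vibrotactile actuator and speaker
// for a simple bimodal alert using standard Android APIs (API 26+).
// The waveform and tone values below are arbitrary examples.
import android.content.Context
import android.media.AudioManager
import android.media.ToneGenerator
import android.os.VibrationEffect
import android.os.Vibrator

fun simpleBimodalAlert(context: Context) {
    // Tactile channel: two 100 ms pulses separated by a 50 ms gap, played once.
    val vibrator = context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator
    vibrator.vibrate(VibrationEffect.createWaveform(longArrayOf(0, 100, 50, 100), -1))

    // Audio channel: a short notification beep at 80/100 volume.
    ToneGenerator(AudioManager.STREAM_NOTIFICATION, 80)
        .startTone(ToneGenerator.TONE_PROP_BEEP2, 150)
}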

Mobile phones are personal devices, always on and always with us: whether the phone is in a bag or pocket, or we are in a meeting, at a party, or listening to music, we still want to be able to interact with it. In these situations, visual feedback is not always appropriate. Although a user’s eyes may be busy with the primary task, many activities do not otherwise prevent users from attending to information through their remaining available senses. This is where multimodal interaction is of benefit: messages can be presented through the audio modality, for instance, and warnings through the tactile. Unfortunately, these modalities can also be inappropriate at times. For example, consider this typical usage scenario: Sam is walking to a meeting with her mobile phone in her bag when she receives an important calendar reminder. As her phone is not in contact with her body, a tactile alert would probably go unnoticed, so the reminder would be best presented in audio. Next, Sam boards a train to continue her journey and downloads some music for her phone. Given that the train is noisy and the phone is in her pocket, audio alerts alone would be insufficient to inform her of the completed download; at the same time, tactile alerts would be slightly masked because the phone is not in direct contact with her skin. Here, a combination of audio and tactile feedback could let her know when her song has finished downloading. Finally, Sam arrives at her meeting and receives an urgent email from her husband. It would be rude for Sam to disrupt the meeting with audio feedback announcing the incoming email; a tactile cue would be much more subtle and socially acceptable. This scenario illustrates the need for mobile devices to offer alternative modalities through which information can be presented when the context requires.
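The scenario implies a simple selection policy. The following minimal Kotlin sketch captures that policy under an assumed context model; the on-body, noise, and social-setting flags are hypothetical illustrations, not a design taken from the article.

// Sketch of the modality-selection policy implied by the scenario.
// The context model (deviceOnBody, ambientNoisy, sociallyQuiet) is hypothetical.
enum class Modality { AUDIO, TACTILE, AUDIO_AND_TACTILE }

data class UsageContext(
    val deviceOnBody: Boolean,   // in a pocket or hand vs. in a bag
    val ambientNoisy: Boolean,   // e.g., on a train
    val sociallyQuiet: Boolean   // e.g., in a meeting
)

fun selectModality(ctx: UsageContext): Modality = when {
    ctx.sociallyQuiet -> Modality.TACTILE                      // meeting: subtle cue
    !ctx.deviceOnBody && !ctx.ambientNoisy -> Modality.AUDIO   // phone in a bag
    ctx.ambientNoisy -> Modality.AUDIO_AND_TACTILE             // train: redundant cues
    else -> Modality.TACTILE
}

fun main() {
    // Sam's three situations from the scenario:
    println(selectModality(UsageContext(deviceOnBody = false, ambientNoisy = false, sociallyQuiet = false))) // AUDIO
    println(selectModality(UsageContext(deviceOnBody = true, ambientNoisy = true, sociallyQuiet = false)))   // AUDIO_AND_TACTILE
    println(selectModality(UsageContext(deviceOnBody = true, ambientNoisy = false, sociallyQuiet = true)))   // TACTILE
}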

As mentioned, multimodal feedback is often used to reduce the visual load on mobile device users. The possibility of communicating information and enhancing interaction through senses other than vision, e.g., sound and touch, has generated a rich body of research (Brewster, 2002; Cockburn & Brewster, 2005; Fukumoto & Sugimura, 2001; Gaver, 1987; Hall, Hoggan, & Brewster, 2008; Kaaresoja, Brown, & Linjama, 2006; Lee & Zhai, 2009; Mereu & Kazman, 1996; Poupyrev & Maruyama, 2003). This research has demonstrated that audio and tactile feedback can benefit mobile touchscreen users, increasing typing speeds and reducing errors with some training. However, as the scenario above demonstrates, users need to be able to switch effortlessly between different modalities depending on the situation, and they need a choice of several modalities to switch between. Much of the research so far does not give the user such a choice but simply provides one output modality, resulting in unimodal interaction.
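To make the crossmodal idea from the abstract concrete: a cue can be specified once in amodal parameters such as rhythm and intensity, then rendered in whichever modality the context allows. The Kotlin sketch below assumes such a two-parameter encoding; the parameter set and print-based renderings are illustrative only, not the article's actual design.

// Sketch of crossmodal encoding: one abstract cue is described by amodal
// parameters (rhythm and intensity) and can be rendered in either modality.
data class CrossmodalCue(
    val rhythmMs: List<Long>,  // alternating on/off durations, shared by both renderings
    val intensity: Double      // 0.0..1.0, maps to loudness or vibration amplitude
)

val newEmailCue = CrossmodalCue(rhythmMs = listOf(200L, 100L, 200L), intensity = 0.7)

fun renderAsAudio(cue: CrossmodalCue) {
    // e.g., play beeps: rhythm -> on/off tone pattern, intensity -> volume
    println("audio: pattern=${cue.rhythmMs} volume=${(cue.intensity * 100).toInt()}%")
}

fun renderAsTactile(cue: CrossmodalCue) {
    // e.g., drive actuator: rhythm -> pulse pattern, intensity -> amplitude
    println("tactile: pattern=${cue.rhythmMs} amplitude=${(cue.intensity * 255).toInt()}")
}

fun main() {
    // The same cue remains recognisable in either modality because the
    // underlying parameters (rhythm, intensity) are preserved.
    renderAsAudio(newEmailCue)
    renderAsTactile(newEmailCue)
}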
