Novel Technologies and Interaction Paradigms in Mobile HCI

Gitte Lindgaard, Sheila Narasimhan
DOI: 10.4018/978-1-60960-499-8.ch019

Abstract

In this chapter the authors argue that it is time for the mobile HCI community to think beyond the traditional screen-keyboard-mouse paradigm and explore the many possibilities that mobility, mobile platforms, and people on the move offer. They present a collection of ideas aimed at encouraging HCI researchers to explore how up-and-coming mobile technologies can inspire new interaction models, alternative I/O methods, and new data collection methods. In particular, they discuss potential applications of gesture-based as well as sound-based technologies. The range of possible applications designed to make life easier for specific user populations is limited, they maintain, only by researchers' imagination in understanding novel problem spaces, in mixing, matching, and expanding on existing methods, and in inventing, testing, and validating new methods.
Chapter Preview

Introduction

The continuing emergence of new technologies, and of creative, innovative ways of employing existing mobile tools for new purposes and new audiences, highlights a need to understand how to maximize the benefits these may offer. Indeed, the need to explore different approaches and techniques for generating user requirements and evaluating mobile and wearable devices has been recognized for some time (e.g. Lumsden & Brewster, 2003; Benyon, Höök & Nigay, 2010; Lindgaard & Narasimhan, 2009). Progress is being made in many areas of human endeavor to embrace new technologies, especially in the mobile arena. The recent mobile HCI literature shows how such new technologies could be deployed in a variety of application domains; it also demonstrates how different technologies may be combined and how existing technologies may be adapted to suit mobile devices and people. In this chapter, we discuss mainly technologies that rely on sensory modalities other than, or in addition to, human vision, and those that go beyond the traditional keyboard-screen paradigm. We explore the techniques currently applied in user research and show, wherever suitable, how the evolving mobile products reported in the literature have been evaluated. In many recent papers, innovative technological approaches and solutions to known user-related problems have only been considered or proposed, with no user-based studies reported as yet. One may regard these ideas as novel approaches to requirements gathering in which technological and human sensory and motor capabilities and boundaries are explored ‘in the wild’. That is, in some cases it is still unclear how the evolving technologies discussed may be useful in future applications. In other papers, small pilot studies with a limited number of participants have been published. Since some of the technologies are still at a very early stage of development, it is not surprising that few papers report results of fully fledged user-based studies. We include all kinds of studies here, as we find the ideas behind proposed or preliminary prototype applications often very inspiring and thought-provoking, allowing us to speculate on potential future mobile applications.

The next section discusses a range of recently published gesture-based approaches employed in mobile devices. It shows how some techniques invented in the gaming industry have been, and could be, used in other areas such as medicine, particularly in rehabilitation. We review two studies concerning wearable gesture-based technologies that can be used in ‘eyes busy’ situations, and one study that frees up mobile touch-screen real estate by using the device casing itself as an interaction mechanism. We go out on a limb by imagining how some of the ideas underlying these novel approaches could be applied to other domains. This is followed by a discussion of some context- and location-aware technologies that, although many issues remain to be overcome, suggest some innovative applications. Next, we review some of the literature on sound-based interaction, covering both non-vocal sounds and speech. Interesting progress is being made in substituting sound for visual information such as graphs and geographical maps, and in using sound to help blind and visually impaired people navigate their environment. One recent application, which uses natural language to present information about graphs and allows users to interrogate the graph content, is described in some detail. We then offer some thoughts on the challenges of collecting data in usability evaluations of mobile devices and conclude that, although we agree with other authors that new evaluation paradigms are needed in mobile HCI, the HCI community must be very careful not to deny both laboratory-based and field-based usability studies their rightful place in the mobile domain.
