A Model-Based Approach to Analysis and Calibration of Sensor-Based Human Interaction Loops

Parisa Eslambolchilar (Swansea University, UK) and Roderick Murray-Smith (Glasgow University, UK)
DOI: 10.4018/978-1-4666-0194-9.ch003

Abstract

The dynamic systems approach to the design of continuous interaction allows designers to use analytical tools such as state-space modeling and Bode diagrams to simulate and analyse the behaviour and stability of a sensor-based application, both on its own and when coupled with a manual control model of user behaviour. This approach also helps designers calibrate and tune the parameters of the sensor-based application before the actual implementation, and in response to user action. In this article the authors introduce definitions of terms from manual control theory for the analysis of the continuous aspects of interaction design and human behaviour. They then provide a theoretical framework for the specification, analysis, and calibration of a sensor-based zooming and scrolling application on mobile devices that includes the user in the interaction loop. The framework is especially topical for guiding the design of sensor-based applications on mobile devices. The authors test it with a tilt-controlled speed-dependent automatic zooming application on a PDA.

1 Introduction

What distinguishes interactive systems from other classes of computing systems is the user, and the general focus of research in interactive systems has been the need to accommodate the user, and specifically the “usability” of the system. One area of research within this has been concerned with the development of models of interactive systems, and sometimes of the user, in order to analyse the behaviour of the user and the system (Jagacinski & Flach, 2003).

With the increasing popularity of mobile phones and handheld devices in general, more and more computers are being used in mobile environments. Millions of mobile phone users carry their devices everywhere, in their hands, pockets, and bags. For many of us these devices are not perceived as computers, but rather as augmented elements of the physical environment (Streitz, 2001). Interaction therefore shifts from an explicit paradigm, in which the user’s attention is on computing, towards an implicit paradigm, in which interfaces themselves draw human attention when required (Streitz, 2000).

Nowadays interaction with handheld devices is not limited to the keyboard, the touch-screen, and traditional interaction design methods based on WIMP (Windows-Icon-Menu-Pointer). These devices can now accept input and provide output via other means. As topical examples, on the iPhone and Nokia N-series the user can rotate the screen view from landscape to portrait and vice versa by rotating the device. Other means of interaction with mobile devices include gesture input and audio/haptic output. These can facilitate one-handed control, which requires less visual attention than two-handed touch-screen control, and such multimodality in the interaction can compensate for the lack of screen space (Dong, Watters, & Duffy, 2005; Fallman, 2002; Hinckley, Pierce, Horvitz, & Sinclair, 2005; Oakley, Ängeslevä, Hughes, & O'Modhrain, 2004; Rekimoto, 1996; Wigdor & Balakrishnan, 2003). In such novel interaction techniques, e.g., gesture recognition and audio/haptic feedback, continuous interaction is at the heart of the exchange between the human and the sensor-based application: the human is tightly coupled to the application over a period of time and exchanges continuous, dynamic input/output with it at a relatively high rate, which cannot be modelled as a series of discrete events and static models (Doherty & Massink, 1999).

The main contribution of this work is a theoretical framework for the specification, analysis, and calibration of sensor-based applications on mobile devices that does not exclude the user from the interaction loop. It is especially topical for guiding the design of sensor-based applications on mobile devices. The work was motivated by analysing an interaction technique called speed-dependent automatic zooming (SDAZ) (Igarashi & Hinckley, 2000), which in previous research has been found to outperform manual zooming approaches on desktop computers (Cockburn & Savage, 2003, 2005). SDAZ unifies rate-based scrolling and zooming to overcome the restrictions in screen space for browsing images and texts. The user controls the scrolling speed only, and the SDAZ application automatically adjusts the zoom level so that the speed of visual flow across the screen remains constant. Using this technique, the user can smoothly locate a distant target in a large document without having to manually interweave zooming and scrolling, and without becoming disoriented by extreme visual flow.
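The speed-to-zoom coupling described above can be sketched as a simple function: on-screen visual flow is roughly document scroll speed multiplied by magnification, so SDAZ zooms out as the user scrolls faster to hold that product constant. The sketch below is illustrative only; the function name, threshold, and constants are assumptions for exposition, not the calibration from Igarashi and Hinckley (2000).

```python
def sdaz_magnification(scroll_speed, flow_target=200.0,
                       min_mag=0.1, max_mag=1.0):
    """Illustrative SDAZ coupling between scroll speed and zoom level.

    scroll_speed: document-space scroll rate set by the user (px/s).
    flow_target:  desired constant on-screen visual flow (px/s).
    Returns a magnification so that the on-screen flow
    (scroll_speed * magnification) does not exceed flow_target.
    """
    speed = abs(scroll_speed)
    if speed <= flow_target:
        # Slow scrolling: stay fully zoomed in, flow is already tolerable.
        return max_mag
    # Fast scrolling: zoom out in inverse proportion to speed, so the
    # screen-space flow stays pinned at flow_target (down to min_mag).
    return max(min_mag, flow_target / speed)
```

For example, at a scroll speed of 400 px/s with a 200 px/s flow target, the document is shown at half magnification, so the visual flow the user perceives remains 200 px/s. Clamping at `min_mag` prevents the view from zooming out indefinitely at extreme speeds.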
