A Conceptual Framework for Interoperability of Mobile User Interfaces with Ambient Computing Environments


Andreas Lorenz (Fraunhofer RWTH Aachen, Germany)
Copyright: © 2010 | Pages: 16
DOI: 10.4018/jmhci.2010070105

Abstract

The use of mobile and hand-held devices is a desirable option for implementing user interaction with remote services from a distance, whereby the user should be able to select the input device depending on personal preferences, capabilities, and the availability of interaction devices. Because of the heterogeneity of available devices and interaction styles, interoperability needs particular attention from the developer. This paper describes the design of a general solution that enables mobile devices to control services on remote hosts. The approach extends the idea of separating the user interface from the application logic, leading to the definition of virtual or logical input devices that are physically separated from the controlled services.
Article Preview

Introduction

The design and implementation of interaction in ambient computing environments cannot rely on traditional input devices like mouse and keyboard. For remote interaction from a distance, Lorenz et al. (2009) revealed a dramatic increase in the error rate when using a wireless mouse and keyboard compared to a hand-held device. Whether a control device is suitable for an intended interaction depends on the capabilities, personal preferences, situation, and task of the user. If the physical shape of the equipment causes complaints or errors in operation, the interaction could be improved either by a revised design of the input hardware or by the freedom to switch to another input device better aligned with the task and the personal attributes of the user.

The opportunity to use mobile devices is a desirable option for enhancing interaction with remote services, in particular if the user is experienced in operating them. These shifts in the usage of computer technology go hand in hand with a re-thinking of user-interface technology. Disconnecting the input from the remote service host requires fundamental research in system models and architectures (Olsen, 2007): “Lots of good research into input techniques will never be deployed until better system models are created to unify these techniques for application developers.” The research described by Lorenz et al. (2008) elaborates the fundamental characteristics of a distributed interactive system and derives the technical components for transmitting user input from an input device to remote services.

The main objective of this work is to enable interoperability of distributed user input components on a mobile or hand-held device and service implementation on a device in the current environment of the user. It copes with heterogeneity on multiple levels:

  • 1. It enables input modalities and interaction styles to be interchanged at runtime.

  • 2. It elaborates a generic solution that goes beyond current, unspecific software-architecture patterns.

  • 3. It delivers software artifacts to cope with incompatible software environments, operating systems, and technologies.

  • 4. It unifies the software development process, documentation, and cooperation between independent persons and development teams.

This paper introduces a framework using virtual input devices to specify the input of the user interface without constraints regarding metaphor, shape, location, or modality. The main requirements to the design of the framework are abstraction, architectural design, and being independent from hard- and software. The specification of the framework identifies the components, defines the relationships between the components and illustrates the data flow within an intended system. The approach enables developers to create interfaces that depend on the meaning of the input rather than on the specific device.
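The notion of a virtual input device described above can be sketched in code: listeners subscribe to semantic commands rather than to a concrete piece of hardware, so the service reacts to the meaning of the input only. This is a minimal illustrative sketch; the class and command names (`VirtualInputDevice`, `Command`) are invented here and are not taken from the paper.

```python
from enum import Enum
from typing import Callable, List

class Command(Enum):
    """Device-independent semantic commands (illustrative set)."""
    PLAY = "play"
    PAUSE = "pause"
    NEXT = "next"

class VirtualInputDevice:
    """Logical input device: forwards semantic commands to subscribers,
    regardless of which physical device produced them."""
    def __init__(self) -> None:
        self._listeners: List[Callable[[Command], None]] = []

    def subscribe(self, listener: Callable[[Command], None]) -> None:
        self._listeners.append(listener)

    def emit(self, command: Command) -> None:
        # The physical origin (button press, gesture, ...) is irrelevant here.
        for listener in self._listeners:
            listener(command)

# A remote service registers for commands and never sees the hardware:
received: List[Command] = []
device = VirtualInputDevice()
device.subscribe(received.append)
device.emit(Command.PLAY)
```

Because the service depends only on the `Command` vocabulary, any physical device that can produce these commands can be swapped in at runtime, which is the interoperability property the framework targets.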

Application Example

The architectural design described in this paper has been used as the reference architecture for implementing a demonstrator that enables mobile and hand-held devices to wirelessly control multimedia applications from a distance. It implements a remote user interface on the mobile phone, from which the user can select one of three options to control the media player application:

  • Hardware Buttons Use the hardware buttons of the mobile phone (see Figure 1, left).

  • Software Buttons Activate software buttons by touching the area on the display. The software buttons mirror the controls of the graphical user interface of the application to control (see Figure 1, central image). In a widget approach, the controls could be downloaded from the remote application on demand.

  • Touch-screen/Gestures Tap and drag gestures on the screen of the mobile phone (see Figure 1, right).

Figure 1.

The three interaction methods on the mobile phone: Hardware buttons, software buttons, and gestures on the touch-sensitive display
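The demonstrator's three interaction methods can be pictured as three device-specific front-ends that translate their events into one shared command set for the media player. The sketch below is purely illustrative; the key codes, button identifiers, and gesture names are invented for this example and are not from the demonstrator.

```python
# Device-specific event tables, one per interaction style (all names invented).
HARDWARE_KEYS   = {"KEY_5": "play_pause", "KEY_6": "next_track"}
SOFTWARE_BUTTONS = {"btn_play": "play_pause", "btn_next": "next_track"}
GESTURES        = {"tap": "play_pause", "swipe_right": "next_track"}

def translate(style: str, event: str) -> str:
    """Translate a device-specific event into a device-independent command
    that the remote media player understands."""
    table = {
        "hardware": HARDWARE_KEYS,
        "software": SOFTWARE_BUTTONS,
        "gesture": GESTURES,
    }[style]
    return table[event]

# All three front-ends yield the same command for the media player:
assert translate("hardware", "KEY_5") \
    == translate("software", "btn_play") \
    == translate("gesture", "tap") \
    == "play_pause"
```

Switching between hardware buttons, software buttons, and gestures then amounts to swapping the translation table, while the media player's command interface stays unchanged.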
