Towards ProGesture, a Tool Supporting Early Prototyping of 3D-Gesture Interaction

Birgit Bomsdorf, Rainer Blum, Daniel Künkel
Copyright © 2015 | Pages: 17
DOI: 10.4018/IJPOP.2015070103

Abstract

Development of gesture interaction requires a combination of three design matters: gesture, presentation, and dialog. However, current work on rapid prototyping focuses on gestures and takes only the presentation into account. Model-based development incorporating gestures, in contrast, supports the gesture and dialog dimensions. The work on ProGesture aims at a rapid prototyping tool supporting coherent development within the whole gesture-presentation-dialog design space. In this contribution, a first version of ProGesture is introduced. Gestures are specified by demonstrating the movements or by composing them from other gestures. The tool also provides a dialog editor, which allows gestures to be assigned to dialog models. Based on its executable runtime system, the models and gestures can be tested and evaluated. In addition, gestures can be bound to first presentations or to existing applications and evaluated in their context.

1. Introduction

3D-gestures, such as touchless hand gestures and body movements, are increasingly used in human-computer interaction. Although gesture-controlled user interfaces have been investigated for several years, systematically developing intuitive and ergonomic 3D-gesture interactions is still challenging. Work in this field aims not only at appropriate gestures that take into account the physiology of the human body and the users' goals, but also at suitable UI widgets and presentations as a whole, as well as at the development process itself.

A gesture, according to Hummels & Stappers (1998) and Saffer (2008), is a coordinated movement or position of a body or parts of a body with the intent to interact with a system. This definition comprises dynamic gestures, which are characterized by their movements, and static gestures (also named poses), which are specific postures held for a short while (Mitra & Acharya, 2007). Furthermore, a gesture may be discrete, i.e. the respective reaction of an associated object is triggered after completing the gesture (e.g. a thumbs-up gesture to move to the next page of a form), whereas a continuous gesture provides the user with simultaneous reactions, i.e. the system's reaction starts and ends with the gesture performance, e.g. "pinch" or "zoom" for map interactions (Ruiz, Li, & Lank, 2011). Pointer-based interactions are typical for the continuous category. Nevertheless, from a usability point of view, it may be rewarding to provide intermediate feedback for all types of gestures, independent of the successful completion of a discrete gesture, e.g. to indicate to the user that the start of a dynamic gesture was recognized.
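To make this taxonomy concrete, the following sketch models the two distinctions (dynamic/static and discrete/continuous) as a small data structure. It is purely illustrative; all names are ours and not part of ProGesture or the cited works.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Form(Enum):
    DYNAMIC = auto()   # characterized by movement over time
    STATIC = auto()    # a pose held for a short while

class Reaction(Enum):
    DISCRETE = auto()    # the system reacts once, after the gesture completes
    CONTINUOUS = auto()  # the system reacts throughout the gesture performance

@dataclass(frozen=True)
class Gesture:
    name: str
    form: Form
    reaction: Reaction

# The examples from the text:
next_page = Gesture("thumbs-up (next form page)", Form.STATIC, Reaction.DISCRETE)
map_zoom = Gesture("pinch/zoom (map)", Form.DYNAMIC, Reaction.CONTINUOUS)
```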

Besides deciding on gestures, feedback, and their relations, it has to be determined how to communicate the gestures to the user. The presentation, more precisely the perceived affordance, highly impacts the gestures users will perform. The concept of affordances was introduced by Norman (1998) and later clarified as perceived affordances. It describes a desirable property of a UI that leads users to perform the correct actions to reach their goals. Hence, designing gesture interaction requires not only developing the gesture set and the presentation but also thoroughly considering their dependencies. A modification of a gesture set often necessitates altering the presentation in order to indicate the new gestures to the user.

Additionally, the dialog structure determines the order of gesture execution and therefore affects the design of the gesture interaction, and vice versa. The dialog may have to be modified, for example, if a sequence of gestures exhausts the user or if the planned presentation leads the user to perform the gestures in a different sequence.
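This coupling can be made explicit by viewing the dialog as a state machine whose transitions are triggered by gestures: changing the gesture set or the expected gesture order then shows up directly as a change in the transition table. The sketch below is a minimal, hypothetical illustration of this view, not ProGesture's actual dialog model.

```python
# Hypothetical dialog model: states plus gesture-triggered transitions.
transitions = {
    ("form_page_1", "thumbs_up"): "form_page_2",
    ("form_page_2", "thumbs_up"): "summary",
    ("form_page_2", "swipe_left"): "form_page_1",
}

def step(state: str, gesture: str) -> str:
    # A gesture not expected in the current state leaves the dialog unchanged.
    return transitions.get((state, gesture), state)

state = "form_page_1"
for gesture in ["thumbs_up", "swipe_left", "thumbs_up", "thumbs_up"]:
    state = step(state, gesture)
print(state)  # -> summary
```

Replacing, say, thumbs_up with a less tiring gesture only requires editing the keys of this table, but it also invalidates every presentation hint that announced the old gesture, which is exactly the dependency described above.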

These different perspectives can be described by three mutually dependent design dimensions: gesture, presentation, and dialog. At the same time, procedures and tools supporting rapid prototyping of gesture interactions are still under investigation. Currently, two approaches exist with respect to these design dimensions: on the one hand, work concentrating on the gesture dimension while taking into account the presentation; on the other hand, work, mostly model-based design approaches, focusing on gestures as a new, possibly additional interaction modality of a dialog. Hence, support for typical development activities has so far been limited to two dimensions and their relationships.

In this paper, our work towards ProGesture, a tool supporting rapid prototyping of 3D-gesture interactions, is introduced. The objective of its development is to cope with the resulting gesture-presentation-dialog design space as a whole in a flexible way. It aims at the early development phases, i.e. at rapid prototyping of 3D-gestures in combination with first UI sketches, such as mockups. In addition, it focuses on dialog modeling and on testing based on executable models. Currently, the main functional modules are implemented as proof-of-concept tools to gain experience and to refine the requirements. These modules are presented in the following by means of their central features and scenarios of use.
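One of these modules lets gestures be composed of other gestures (see the abstract). As a rough illustration of the composition idea, the sketch below represents a gesture either as a demonstrated primitive or as an ordered sequence of parts; all names are hypothetical, since the paper does not expose ProGesture's internal representation.

```python
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Demonstrated:
    """A primitive gesture, specified by demonstrating the movement."""
    name: str

@dataclass(frozen=True)
class Composite:
    """A gesture composed of other gestures, performed in sequence."""
    name: str
    parts: tuple

GestureSpec = Union[Demonstrated, Composite]

def flatten(gesture: GestureSpec) -> list:
    """Expand a (possibly nested) composite into its primitive movements."""
    if isinstance(gesture, Demonstrated):
        return [gesture.name]
    return [name for part in gesture.parts for name in flatten(part)]

swipe_left = Demonstrated("swipe_left")
swipe_right = Demonstrated("swipe_right")
wave = Composite("wave", (swipe_left, swipe_right, swipe_left))

print(flatten(wave))  # -> ['swipe_left', 'swipe_right', 'swipe_left']
```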
