Physicality in Technological Interface Design

Andrew J. Wodehouse, Jonathon Marks
Copyright: © 2015 | Pages: 21
DOI: 10.4018/978-1-4666-8679-3.ch019

Abstract

This research explores emotional response to gesture in order to inform future product interaction design. After describing the emergence and likely role of full-body interfaces with devices and systems, the importance of emotional reaction to the necessary movements and gestures is outlined. A gestural vocabulary for the control of a web page is then presented, along with a semantic differential questionnaire for its evaluation. An experiment is described in which users undertook a series of web navigation tasks using the gestural vocabulary and then recorded their reactions to the experience. A number of insights were drawn concerning the context, precision, distinction, repetition and scale of gestures when used to control or activate a product. These insights will be of help in interaction design and provide a basis for further development of gestural vocabularies.
Chapter Preview

Introduction

As technology becomes increasingly sophisticated, consumers expect more powerful and natural user interfaces than have previously been available (Shan, 2010). While User-Centered Design (UCD) ensures that the task-oriented needs of users are recognized, the increasing adoption of Human-Centered Design (HCD) and User Experience (UX) has recognized the broader need for our interactions with technology to be “physically, perceptually, cognitively and emotionally intuitive” (Giacomin, 2014). As products become increasingly “dematerialised” (Dunne, 2008) through the use of electronics, physical operation has in many cases been replaced by control through software – for example, televisions, vending machines, and smartphones are experienced primarily as an interface rather than as a physical entity. Despite the emergence of UCD, HCD and UX, the complexity of many control systems means that the experience of using too many contemporary products is unrewarding and, in the worst cases, emotionally upsetting (Moggridge, 2007; Norman, 2004). This is perhaps less surprising when viewed from an evolutionary perspective: for two million years humans have interacted with their environment through physical manipulation. Since the earliest stone tools, our physiology has adapted and improved to provide us with the motor skills to perform operations of great complexity (Lancaster, 1968; Susman, 1998), and this manual dexterity has long been discussed as a key factor in the development of human intellectual capacity (Skoyles, 1999; Stout & Chaminade, 2007). These innate characteristics make physical movement attractive in the control of products (Costello & Edmonds, 2007) and are likely to be important in the era of ubiquitous or pervasive computing (Abawajy, 2009; Hassenzahl, 2013).

This work therefore explores how we can balance and extend computer interaction to make better use of the human body. While Gesture Controlled User Interfaces (GCUIs) have existed for the last 30 years (Bhuiyan & Picking, 2011; Buxton, 2012), recent developments in motion detection and analysis have made the necessary hardware and software more widely available to researchers. This has resulted in increased attention to the applications and possibilities of such technology beyond its original use in gaming. For example, Kuhnel et al. (2011) have conducted studies on the use of three-dimensional gestures with a mobile phone to control a smart home environment, using the phone's motion sensors to detect basic swipes, tilts and points that control various devices. In revisiting the workstation interface, Bhruguram et al. (2012) have suggested replacing the mouse with camera and motion detection technology while retaining the conventional movements associated with a mouse, preserving the familiarity of a known paradigm rather than reinventing it from first principles. When defining a new, hands-free system for basic interactions with a CAD system, Jeong et al. (2012) utilized simple static gestures based on the number of fingers held up for selection, translation and so on, although these cannot be considered intuitive. Despite this research on the set-ups and applications of GCUIs, there is less understanding of which gestures should be employed and why.
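
Common to the systems above is a pipeline in which a recogniser emits a discrete gesture label and the application maps that label to a control action. The sketch below illustrates this dispatch step for the web navigation scenario examined later in the chapter; it is a minimal, hypothetical Python example, and the gesture names and the actions they trigger are illustrative assumptions rather than the vocabulary defined by the study.

```python
from typing import Callable, Dict

# Hypothetical gesture vocabulary for web page control. The labels and the
# actions bound to them are illustrative assumptions, not the vocabulary
# evaluated in the chapter's experiment.

def go_back() -> None:
    print("Navigate to previous page")

def go_forward() -> None:
    print("Navigate to next page")

def scroll_down() -> None:
    print("Scroll page down")

def scroll_up() -> None:
    print("Scroll page up")

# Map discrete gesture labels (as produced by a recogniser) to browser actions.
GESTURE_ACTIONS: Dict[str, Callable[[], None]] = {
    "swipe_left": go_back,
    "swipe_right": go_forward,
    "swipe_down": scroll_down,
    "swipe_up": scroll_up,
}

def dispatch(gesture_label: str) -> None:
    """Execute the action bound to a recognised gesture, if any."""
    action = GESTURE_ACTIONS.get(gesture_label)
    if action is None:
        print(f"Unmapped gesture: {gesture_label!r}")
        return
    action()

if __name__ == "__main__":
    # Simulated recogniser output for a short browsing session.
    for label in ("swipe_down", "swipe_left", "wave"):
        dispatch(label)
```

In practice, the interesting design questions lie less in this dispatch table than in which gestures should populate it and how users respond emotionally to performing them, which is the focus of the study described in this chapter.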
