Hand Gesture Recognition as Means for Mobile Human Computer Interaction in Adverse Working Environments

Jens Ziegler (Technische Universität Dresden, Germany), Randy Döring (Technische Universität Dresden, Germany), Johannes Pfeffer (Technische Universität Dresden, Germany) and Leon Urbas (Technische Universität Dresden, Germany)
DOI: 10.4018/978-1-4666-4623-0.ch017

Abstract

Interacting with mobile devices can be challenging in adverse working environments. Using hand gestures for interaction can overcome severe usability issues that users face when working with mobile devices in industrial settings. This chapter is dedicated to the design, implementation, and evaluation of mobile information systems with hand gesture recognition as means for human computer interaction. The chapter provides a concise theoretical background on human gestural interaction and gesture recognition, guidelines for the design of gesture vocabularies, recommendations for proper implementation, and parameterization of robust and reliable recognition algorithms on energy-efficient 8-bit architectures. It additionally presents an elaborated process for participatory design and evaluation of gesture vocabularies to ensure high usability in the actual context of use. The chapter concludes with a case study that proves the suitability of the proposed framework for the design of efficient and reliable hand gesture-based user interfaces.

Introduction

Mobile IT-supported work is becoming a key competitive advantage in the factories of the future. As nearly all enterprise data is available in digital form, business processes and workflows increasingly rely on information technology. This dependency is obvious in office work, but it is also increasing for mobile work. Maintenance personnel, engineers, site managers, transporters, or building inspectors – the entire mobile workforce of a company requires information to do their jobs. Mobile information systems provide data, workflow support, documentation and reporting services, access to plant equipment or the control system, and even augmented reality for specific purposes. Coming from the area of mobile consumer products, prominent pioneers of mobile interaction have helped to bring mobile devices of all sizes into all areas of daily life, and increasingly into office routine. Today, direct interaction styles such as touch-and-swipe interaction, speech recognition, and tilt interfaces – supplemented by haptic or acoustic feedback, and also combined in multi-modal user interfaces – dominate interaction with mobile devices. These user interfaces enable users to choose the most appropriate interaction style for their specific context of use. Because of their widespread use, these concepts have usually been adopted directly for mobile information systems. However, this has quickly led to problems, as the demands on robustness and usability differ significantly from those of the consumer market. For example, adverse environmental conditions with challenging features such as changing light, high humidity, dirt, dust, grease, and liquids require protective clothing, including helmets and working gloves. Users have to use one or even both hands to accomplish various physical tasks, at least temporarily. Long distances between different work locations lead to occasional and frequently interrupted usage of the system.
Mobile information systems need to be operable under all these conditions. When conventional user interfaces fail, they must be substituted by alternatives that have been designed for the particular context of use. Alternative visual displays such as head-mounted displays have already been successfully tested and implemented (e.g. Witt, 2007). However, more interaction concepts for input devices need to be developed and evaluated in industrial environments. Without such input devices, reliable mobile IT support under adverse environmental conditions will remain impossible.

This book chapter is devoted to hand gesture recognition using on-body sensors, a promising input style for working environments where the above-mentioned ways of interaction are impossible or improper, e.g. in cleanrooms in the semiconductor industry, process plants, or factories in the manufacturing industry. Hand gesture recognition has the potential to meet the requirements of professional working environments. It allows one-handed work while operating the mobile device and two-handed work when the device is not in use. Being wearable, self-contained, lightweight, and robust, it forms a reliable, efficient, and inexpensive intelligent user interface. Wearable systems with head-mounted displays, pico projectors, or wearable flexible display screens can especially benefit from hand gesture-based interaction.

The remainder of this book chapter is structured as follows. In section Hand Gesture Based Interaction with Mobile Devices, a concise discussion of the theoretical foundations of human gestural interaction and gesture recognition highlights current challenges and gives valuable guidelines for the design of gesture recognition systems and gesture vocabularies. A literature review grounds the topic in the current state of research on hand gesture recognition using Hidden Markov Models (HMM). Section Gesture Recognition Using Hidden Markov Models goes into the details of HMM-based gesture recognition algorithms and gives recommendations for proper implementation and parameterization of robust and reliable recognition algorithms on energy-efficient 8-bit architectures. The following section, Case Study: A Wearable Dynamic Hand Gesture Recognition System, presents a case study which was conducted as part of a development project to demonstrate the suitability of the proposed framework for the design of efficient and reliable hand gesture-based user interfaces. The section provides a description of an actual implementation of a hand gesture-based user interface for a wearable system and a two-step participatory design process (Schuler & Namioka, 1993) for gesture vocabularies used in mobile support applications. The chapter concludes with an outlook on future research directions.
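To illustrate the HMM-based recognition approach named above, the following is a minimal, hypothetical sketch of gesture classification with discrete Hidden Markov Models: a quantized sensor sequence (e.g. accelerometer codes) is scored under one HMM per gesture via the forward algorithm, and the gesture with the highest likelihood wins. All model parameters and gesture names here are illustrative toy values, not taken from the chapter.

```python
import math

def _logsumexp(xs):
    """Numerically stable log of a sum of exponentials."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of an observation sequence under a discrete HMM.

    obs: observation symbol indices (e.g. quantized accelerometer codes)
    pi:  initial state probabilities (length N)
    A:   N x N state transition matrix
    B:   N x M emission matrix (state -> symbol probability)
    """
    n_states = len(pi)
    # Initialization in log space to avoid underflow on long sequences;
    # the tiny constant guards against log(0) for impossible states.
    alpha = [math.log(pi[s] * B[s][obs[0]] + 1e-300) for s in range(n_states)]
    # Induction: sum over predecessor states, then apply the emission.
    for t in range(1, len(obs)):
        alpha = [
            _logsumexp([alpha[sp] + math.log(A[sp][s] + 1e-300)
                        for sp in range(n_states)])
            + math.log(B[s][obs[t]] + 1e-300)
            for s in range(n_states)
        ]
    return _logsumexp(alpha)

def classify(obs, models):
    """Return the gesture whose HMM assigns the highest likelihood."""
    return max(models, key=lambda g: forward_log_likelihood(obs, *models[g]))

# Toy two-gesture vocabulary of 2-state left-to-right HMMs (pi, A, B),
# with two observation symbols; real systems would train these from data.
models = {
    "swipe": ([1.0, 0.0],
              [[0.7, 0.3], [0.0, 1.0]],
              [[0.9, 0.1], [0.1, 0.9]]),
    "circle": ([1.0, 0.0],
               [[0.7, 0.3], [0.0, 1.0]],
               [[0.1, 0.9], [0.9, 0.1]]),
}

print(classify([0, 0, 1, 1], models))  # a swipe-like symbol sequence
```

In practice, an on-body implementation would quantize raw accelerometer samples into a small symbol alphabet and use fixed-point arithmetic to fit the 8-bit architectures discussed in the chapter; the left-to-right topology shown here is a common choice for dynamic gestures because states then correspond to successive phases of the movement.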
