Mobile Interactions Augmented by Wearable Computing: A Design Space and Vision

Stefan Schneegass, Thomas Olsson, Sven Mayer, Kristof van Laerhoven
DOI: 10.4018/978-1-5225-5484-4.ch049

Abstract

Wearable computing has huge potential to shape the way we interact with mobile devices in the future. Interaction with mobile devices is still mainly limited to visual output and tactile finger-based input. Despite visions of next-generation mobile interaction, the hand-held form factor hinders new interaction techniques from becoming commonplace. In contrast, wearable devices and sensors are intended for more continuous and close-to-body use. This makes it possible to design novel wearable-augmented mobile interaction methods, both explicit and implicit. For example, the ECG signal from a wearable chest strap could be used to identify user status and change the device state accordingly (implicit), while optical tracking with a head-mounted camera could be used to recognize gestural input (explicit). In this paper, the authors outline the design space for how existing and envisioned wearable devices and sensors could augment mobile interaction techniques. Based on designs and discussions in a recently organized workshop on the topic, as well as other related work, the authors present an overview of this design space and highlight some use cases that underline its potential.

Introduction

Throughout the development of mobile phones and other mobile information devices, the input and output methods have gradually changed as well. Early mobile phones had only physical buttons for tactile input and a small monochrome display for visual output. Over the last decade, with the introduction of smartphones, the archetypal mobile device has turned into a sensor-rich device featuring a large touch screen, greatly increased computational power, and, most importantly, built-in sensors such as accelerometers, gyroscopes, and GPS (Hinckley, Pierce, Sinclair, & Horvitz, 2000). Since the touch-screen revolution, these sensors have enriched the interaction possibilities, allowing, for example, moving the phone in mid-air for gestural interaction or tracking users’ physical activity while the phone stays in the pocket, as the sketch below illustrates.
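As a minimal sketch of such sensor-based gestural input, the following Kotlin snippet detects a mid-air shake gesture from an Android phone's built-in accelerometer. The class name, threshold, and debounce interval are illustrative assumptions, not values from the chapter:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Minimal shake detector: fires a callback when the acceleration
// magnitude (gravity removed) exceeds a threshold. Threshold and
// debounce values are illustrative, not empirically tuned.
class ShakeDetector(
    context: Context,
    private val onShake: () -> Unit
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private var lastShakeMs = 0L

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val (x, y, z) = event.values
        // Subtract standard gravity from the magnitude of the raw signal.
        val gForce = sqrt(x * x + y * y + z * z) - SensorManager.GRAVITY_EARTH
        val now = System.currentTimeMillis()
        if (gForce > 12f && now - lastShakeMs > 500) { // ~1.2 g above rest, 500 ms debounce
            lastShakeMs = now
            onShake()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```

The same listener pattern extends to the gyroscope or step counter; only the sensor type and the decision rule change.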

Despite the rapid progress, the form factor of mobile phones is still a limitation. They are hand-held devices, and the main explicit input method still involves holding the phone in one hand and interacting with the other. In other words, many of the sensors and other capabilities remain underutilized by current applications and interaction techniques, partly because of the hand-held form factor.

Fortunately for developers of new interaction techniques, rapidly evolving wearable devices are slowly entering the market with not only more and better sensors but also more opportune form factors and body locations. Wearable devices and peripherals, such as fitness bracelets, chest straps, wrist-worn devices, or head-mounted devices, allow for new types of close-to-body interactions. Moving even closer to the body, smart garments allow sensors and actuators to be placed unobtrusively against the human body. However, the gap between the products that arrive on the mass market and the research prototypes envisioned in the literature is still huge. Wearable computers have a history that started back in the 1960s, when Thorp's wearable computer calculated roulette probabilities (Thorp, 1998). Since then, a number of different devices have been built, realizing a variety of applications. Garments measuring the physiological properties of the user (Gopalsamy, Park, Rajamanickam, & Jayaraman, 1999), belts detecting the user's posture (Farringdon, Moore, Tilbury, Church, & Biemond, 1999), and wearable displays showing information about the user (Falk & Björk, 1999) were all explored in the last millennium. More than 15 years later, almost none of these prototype devices has achieved success in the mass market.

What is currently particularly interesting is the potential in combining wearable and hand-held devices: the hand-held smart devices offer vast computational capabilities and connectivity, while the wearable sensors and actuators can be placed at various parts of the body to allow more direct, accurate, and always-accessible input and output. Looking at the devices that are currently successful on the mass market, such as fitness bracelets or heart-rate monitors, it becomes apparent that they are in effect external sensors that extend the sensing capabilities of the user's smartphone rather than fully functional stand-alone systems. These devices mainly fulfill basic use cases and applications, nowadays chiefly in the fitness and eHealth domains, but they are not restricted to them.
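To make this division of labour concrete, the sketch below (again Kotlin on Android; permission checks and error handling omitted, and the function name is an illustrative assumption) subscribes to the standard Bluetooth Heart Rate GATT service that many commercial chest straps expose, so the phone receives the wearable's sensor stream:

```kotlin
import android.bluetooth.*
import android.content.Context
import java.util.UUID

// Standard Bluetooth SIG UUIDs for the Heart Rate service (0x180D),
// its Measurement characteristic (0x2A37), and the notification descriptor (0x2902).
val HEART_RATE_SERVICE: UUID = UUID.fromString("0000180d-0000-1000-8000-00805f9b34fb")
val HEART_RATE_MEASUREMENT: UUID = UUID.fromString("00002a37-0000-1000-8000-00805f9b34fb")
val CLIENT_CONFIG: UUID = UUID.fromString("00002902-0000-1000-8000-00805f9b34fb")

fun connectHeartRateStrap(
    context: Context,
    device: BluetoothDevice,
    onBpm: (Int) -> Unit
): BluetoothGatt =
    device.connectGatt(context, false, object : BluetoothGattCallback() {

        override fun onConnectionStateChange(gatt: BluetoothGatt, status: Int, newState: Int) {
            if (newState == BluetoothProfile.STATE_CONNECTED) gatt.discoverServices()
        }

        override fun onServicesDiscovered(gatt: BluetoothGatt, status: Int) {
            val hr = gatt.getService(HEART_RATE_SERVICE)
                ?.getCharacteristic(HEART_RATE_MEASUREMENT) ?: return
            // Ask the strap to push measurements instead of polling.
            gatt.setCharacteristicNotification(hr, true)
            hr.getDescriptor(CLIENT_CONFIG)?.let {
                it.value = BluetoothGattDescriptor.ENABLE_NOTIFICATION_VALUE
                gatt.writeDescriptor(it)
            }
        }

        override fun onCharacteristicChanged(gatt: BluetoothGatt, ch: BluetoothGattCharacteristic) {
            // Per the Heart Rate Measurement spec, flag bit 0 selects an
            // 8-bit or 16-bit (little-endian) heart-rate field.
            val flags = ch.value[0].toInt()
            val bpm = if (flags and 0x01 == 0) ch.value[1].toInt() and 0xFF
                      else ((ch.value[2].toInt() and 0xFF) shl 8) or (ch.value[1].toInt() and 0xFF)
            onBpm(bpm)
        }
    })
```

The strap contributes only the body-worn sensing; all parsing, storage, and application logic stay on the handheld, mirroring the external-sensor role described above.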

In fact, there are hundreds of smartphone applications that utilize these sensors to expand the variety of use cases and applications into different domains. To facilitate this transition, the integration of the wearable device into the user's mobile ecosystem is one of the success criteria for wearable devices. This motivates investigating a new design space for mobile interaction that takes into account sensing and actuating capabilities beyond the smartphone. While current smartphone applications must cope with the limited sensing and actuating capabilities, as well as the limited placement possibilities, of today's smartphones, wearable devices can augment these possibilities. In contrast to using only the touch and voice input of the device itself, a virtually unlimited number of sensors and actuators connected to one's mobile device can be used, allowing various novel applications and interactions to be envisioned and realized.
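As a hint at the implicit side of this design space mentioned in the abstract, the sketch below (pure Kotlin; the window size and heart-rate thresholds are placeholder assumptions, not values from the chapter) turns a wearable heart-rate stream into a coarse user state that an application could use, for example, to defer notifications during exertion:

```kotlin
// Coarse implicit-state inference from a wearable heart-rate stream.
// Window size and thresholds are illustrative and would need per-user calibration.
enum class UserState { RESTING, ACTIVE, EXERTED }

class ImplicitStateEstimator(
    private val windowSize: Int = 10,          // samples in the moving average
    private val onStateChange: (UserState) -> Unit
) {
    private val window = ArrayDeque<Int>()
    private var current = UserState.RESTING

    fun onBpm(bpm: Int) {
        window.addLast(bpm)
        if (window.size > windowSize) window.removeFirst()
        val avg = window.average()
        val next = when {
            avg < 90.0  -> UserState.RESTING
            avg < 130.0 -> UserState.ACTIVE
            else        -> UserState.EXERTED
        }
        if (next != current) {
            current = next
            onStateChange(next)                // e.g. silence notifications while EXERTED
        }
    }
}
```

Wired to the heart-rate callback sketched above, for instance via connectHeartRateStrap(context, device, estimator::onBpm), this closes the loop from body-worn sensor to phone-side behavior without any explicit user input.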
