Unobtrusive Interaction with Mobile and Ubiquitous Computing Systems through Kinetic User Interfaces


Vincenzo Pallotta
DOI: 10.4018/978-1-60960-042-6.ch043

Abstract

Unobtrusiveness is a key factor in the usability of mobile and ubiquitous computing systems. These systems are made of several ambient and mobile devices whose goal is to support users' everyday activities, hopefully without interfering with them. We address the topic of obtrusiveness by assessing its impact on the design of interfaces for mobile and ubiquitous computing systems. We make the case for how unobtrusive interfaces can be designed by means of Kinetic User Interfaces: an emerging interaction paradigm where input to the system is provided through the coordinated motion of objects and people in physical space.
Chapter Preview

1. Introduction

During the last ten years, much research has been carried out in mobile and Ubiquitous Computing (ubicomp) and Human-Computer Interaction (HCI) to address the usability problems raised by adapting old-style interaction models to newly emergent interaction paradigms (see, for instance, Bellotti et al., 2002). When HCI intersects ubicomp, many assumptions made for designing interaction with ordinary computing devices are no longer valid. In mobile and ubicomp systems, computers exist in different forms, and only a minimal portion are ordinary desktop computers (i.e., where interaction is performed through screens, keyboards, and mice). The interface is now distributed in space and time: the motion of objects and people can be used to interact with physical places enriched with digital appliances. Moreover, these interfaces include modalities that are typically not under the conscious control of the user, such as motion, gesture, heartbeat, temperature, and sweat (see, for instance, Stach et al., 2009). Through wearable sensors and smart-object technology, all these inputs can easily be collected and used for interaction with computers.

As pointed out by Weiser and Seely Brown (1997), interaction with a ubicomp system should be realized through unobtrusive interfaces; more precisely, interfaces that, when used, do not capture the full attention of the user, who can still use the system while performing other foreground tasks. One term denoting systems with interfaces of this type is "Calm Technology", which stresses the importance of adapting computers and their interfaces to the human pace rather than the other way around. In this vision, computers should follow users in their daily activities and be ready to provide information or assistance on demand.

Unfortunately, while widely used, the notion of unobtrusiveness has not yet been precisely defined. For some, unobtrusiveness relates to the fact that the interface "disappears" (or its visible component fades away) when it is not used or in focus (Kim & Lee, 2009), while others understand unobtrusiveness as the "invisibility" of the interface when it is used, thus raising all the issues of user privacy (Beckwith, 2003). Our understanding of unobtrusiveness is instead related to the fact that obtrusive interfaces force direct interaction with the system in many situations where the interaction could simply be avoided by inferring the user's intentions from implicit behaviour and contextual information. There is a substantial difference between an interface not being "visible" and one not "demanding attention". Weiser's notion of invisibility refers to the second aspect: users will always be made aware that their input is being captured, but this will be done with minimal attention or cognitive load.

As proposed by Abowd et al. (2002), mobile and ubicomp user interfaces must provide support for implicit input. By implicit input we mean input obtained from users by simply observing their behaviour or sensing the interaction space (i.e., sensing the status of the objects that the user is supposed to interact with). Unlike explicit input, implicit input does not necessarily require the conscious supervision of the user and might trigger what Alan Dix calls incidental interactions (Dix, 2002). Incidental interaction presupposes neither a precise user goal nor conscious attention. Rather, it happens when the system reacts to one or more ongoing user activities. Users may either become aware of the effects of incidental interactions (e.g., when the courtesy lights are switched on upon getting into a car), or the effects can be hidden and reflected only at the system level (e.g., when a highway transit payment is made by driving through an electronic toll collection station).
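The idea of incidental interaction can be sketched as an event-driven system in which sensed activity, rather than an explicit command, triggers a reaction. The following minimal Python sketch is illustrative only: the names (`SensedEvent`, `IncidentalSystem`, the handlers) and the two scenarios are our own rendering of the chapter's car-door and toll-gate examples, not an API from the cited work.

```python
# Minimal sketch of incidental interaction (after Dix, 2002): the system
# reacts to sensed activity; the user issues no explicit command.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class SensedEvent:
    kind: str                 # e.g. "door_opened", "toll_gate_passed"
    payload: dict = field(default_factory=dict)

class IncidentalSystem:
    """Dispatches sensed events to handlers; input is implicit."""
    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[SensedEvent], str]]] = {}
        self.log: list[str] = []

    def on(self, kind: str, handler: Callable[[SensedEvent], str]) -> None:
        self._handlers.setdefault(kind, []).append(handler)

    def sense(self, event: SensedEvent) -> None:
        # The event comes from sensors, not from a UI action.
        for handler in self._handlers.get(event.kind, []):
            self.log.append(handler(event))

system = IncidentalSystem()
# Visible effect: courtesy lights turn on when the driver gets in.
system.on("door_opened", lambda e: "courtesy lights on")
# Hidden effect: toll paid while driving through, reflected only at system level.
system.on("toll_gate_passed", lambda e: f"charged {e.payload['fee']} to account")

system.sense(SensedEvent("door_opened"))
system.sense(SensedEvent("toll_gate_passed", {"fee": 2.5}))
print(system.log)
```

Both reactions occur as side effects of ongoing activity: the driver's goal (getting in the car, driving home) never includes "switch on the lights" or "pay the toll".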

We consider here an emerging interaction paradigm in mobile and ubiquitous computing based on the Kinetic User Interfaces (KUIs) model (Pallotta et al., 2008a). KUIs will be shown to be unobtrusive because the user's motion activity (rather than the user's tasks and goals) is taken into account for interaction. In this type of interface, the user's kinetic behaviour is observed by the system, which is then capable of inferring the user's goals and intentions as well as their level of attention.
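To make the KUI idea concrete, a system might label a user's kinetic behaviour from a trace of sensed positions and use that label as implicit input. The toy classifier below is a sketch under our own assumptions: the function name, the speed thresholds, and the three labels are illustrative and do not come from Pallotta et al. (2008a).

```python
import math

def classify_motion(positions, dt=1.0):
    """Toy kinetic classifier: label a trace of (x, y) positions (metres),
    sampled every dt seconds, by average speed. Thresholds illustrative."""
    if len(positions) < 2:
        return "still"
    # Total path length over the trace, then average speed in m/s.
    dist = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
    speed = dist / (dt * (len(positions) - 1))
    if speed < 0.2:
        return "still"    # likely attending to something in place
    if speed < 2.5:
        return "walking"  # casual movement; low-attention interaction possible
    return "running"      # hurried; the system should stay out of the way

print(classify_motion([(0, 0), (1, 0), (2, 0)]))
```

A KUI built on such a classifier would adapt its behaviour to the inferred state, for instance deferring notifications while the user is "running", without the user ever issuing a command.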

Key Terms in this Chapter

Attention: In the context of mobile and ubicomp systems and interfaces, attention is defined as the level of cognitive load experienced by a user performing an operation on the system's interface. High levels of attention typically entail consciousness, while lower levels of attention can lead to unconscious behavior.

User Interface: The hardware and software that allow users to interact with a device or a system.

Ubiquitous Computing: This term denotes the tendency of having computing devices embedded in everyday objects and places. Ubiquitous computing (or ubicomp) is a research discipline that focuses on the design of computing systems that can be used in any situation, regardless of where the computing devices are located.

Context-Awareness: Systems that are capable of interpreting data coming from sensors are said to be context-aware. In ubicomp and interaction design, context is typically defined as a "situation of use". A context-aware device is capable of recognizing the situation of its use (e.g., the user, the location, the time) and of adapting its behavior accordingly.

Direct Manipulation: In user interfaces, this denotes the situation in which input devices directly trigger the update of the system's objects. For instance, dragging a file icon onto a folder icon in a graphical user interface moves the corresponding file from one directory to another.

Interaction Modality: The type of input or output associated with a specific interaction with a system. For instance, text input through a keyboard and text output through a terminal constitute a modality for interacting with a command-based user interface.

Kinetic User Interface: A user interface where the motion of objects and people is captured and used as input for interaction with computing systems.

Activity: An activity is a process in which an agent performs a coordinated sequence of actions, not necessarily aimed at a precise goal. Activities are performed to maintain a state. An activity can be recognized by the emergence of action patterns.
