Mobile Interaction in Real Augmented Environments: Principles, Platforms, Development Processes and Applications

Bertrand David (LIESP Lab, Ecole Centrale de Lyon, France) and René Chalon (LIESP Lab, Ecole Centrale de Lyon, France)
DOI: 10.4018/978-1-61520-655-1.ch032
In this chapter the authors describe a real augmented environment and its associated mobile interactions, based on wearable computers with appropriate interaction devices: either classical computer interaction devices or real objects augmented with computer interfaces, known as tangible objects. After presenting the main principles, they describe a concrete platform and the related MDA development processes, and present several applications. These include contextual collaborative maintenance of industrial appliances with associated just-in-time mobile learning, and a nutritional coaching system that supports the practice and learning of nutritional decision-making in relation to specific requirements in health or high-level sport.
Chapter Preview


As announced by Weiser in 1991 (Weiser, 1991), ubiquitous computing (also known as pervasive computing) seems to have taken on concrete form with the massive propagation of mobile, connected devices (e.g. PDAs, smartphones) and the increasing everyday use of computer resources such as RFID tags (Srivasta, 2005). Since 2001, ubiquitous computing has been considered an integral part of Ambient Intelligence (AmI) (Ambient, 2005), which merges “ubiquitous computing” and “social user interfaces” to adapt user interfaces to the environment and task context, thereby creating proactivity. On the other hand, Mixed Reality (Milgram et al., 1995; Renevier et al., 2002), better known as Augmented Reality (AR) and first described in 1993 by Wellner (Wellner et al., 1993), is also expanding. AR attempts to merge the physical and digital worlds to facilitate the user’s task with new devices and specific interaction techniques (e.g. a physical block controlling a digital one). However, the user interface of these new mobile, connected devices is similar to that of a desktop computer and is often inappropriate for mobile users who have to perform several tasks simultaneously (talking with other people, maintaining technical equipment, visiting a tourist spot, etc.). We note that even though these devices can sense the environment (GPS, RFID tag detection, etc.), they rarely let the user benefit from this contextual knowledge. Thus, they should proactively adapt their behavior without explicit user intervention, as in an Ambient Intelligence environment (Ambient, 2005). AR devices and techniques can be particularly useful in this respect.
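The proactive adaptation described above can be sketched as follows. This is a minimal illustration, not an implementation from the chapter: the tag identifiers, context names and UI profiles are all hypothetical assumptions, chosen only to show a device switching its interface when an RFID tag reveals the surrounding task context, without explicit user action.

```python
# Hypothetical sketch of proactive, context-sensitive UI adaptation.
# All tag identifiers, contexts and profiles below are illustrative.

# Illustrative mapping from detected RFID tag identifiers to task contexts.
TAG_CONTEXTS = {
    "tag:pump-17": "maintenance",
    "tag:museum-entry": "tourist_visit",
}

# Illustrative UI profiles per context: hands-busy maintenance favours
# voice output and a checklist; a tourist visit favours a visual map.
UI_PROFILES = {
    "maintenance": {"output": "voice", "layout": "checklist"},
    "tourist_visit": {"output": "screen", "layout": "map"},
    None: {"output": "screen", "layout": "desktop"},  # no known context
}

def adapt_ui(detected_tag):
    """Select a UI profile from the context implied by an RFID tag."""
    context = TAG_CONTEXTS.get(detected_tag)  # None if tag is unknown
    return UI_PROFILES[context]

# The device detects a tag on an industrial appliance and adapts
# its interface without being asked to.
profile = adapt_ui("tag:pump-17")
print(profile)
```

The point of the sketch is that the user never issues a command: the environment (here, the detected tag) drives the interface change, which is what the chapter calls proactivity.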

Our aim is to study, within the Ubiquitous Computing and Mixed Reality domains, innovative human-computer interfaces (Beaudouin-Lafon, 2000). These interfaces should be appropriate for mobile users working in a collaborative, context-aware manner, with access to precise contextual and/or personal data in a Computer Augmented Environment. Our main concepts are:

  • MoUI (Mobile User Interfaces), which are user interfaces for wearable computers,

  • CAE (Computer Augmented Environments) in the sense of Mixed Reality and Ubiquitous Computing,

  • MOCOCO (MObility, COoperation, COntextualisation) denoting tasks performed collaboratively by several mobile actors, who have access to precise and contextualized data,

  • Proactivity, the transparent user interface adaptation enabled by an Ambient Intelligence Environment.


Imera Platform

For our studies we defined the IMERA platform (French acronym for Mobile Interaction in Real Augmented Environment). The platform consists of a working area and two or three remote workspaces. The working area, where the different actors are located, is a CAE (Computer Augmented Environment): an area covered by a WiFi network and able to receive signals from RFID tags, either freely placed or integrated into real objects located in this space. This constitutes our first support for the ambient intelligence environment. Fixed RFID readers can also be installed in the area. Each actor moves freely in the area with a wearable computer equipped with a WiFi card and an RFID reader, allowing connection to the network and access to contextual data read from the tags. The WiFi network connects the actors both with one another and with back-office systems (database servers, etc.), giving them access to large amounts of data.
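The structure of the working area can be modelled schematically as follows. This is a sketch under stated assumptions, not part of the IMERA platform itself: the class names, the tag identifier and the tag data are all hypothetical, chosen only to show the relationship between the CAE, its RFID tags, and an actor whose wearable computer reads contextual data from them.

```python
# Hypothetical model of the IMERA working area: a Computer Augmented
# Environment (CAE) holding RFID tags, and mobile actors whose wearable
# computers (WiFi card + RFID reader) access contextual data in it.
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class RfidTag:
    tag_id: str
    data: str  # contextual data attached to the tag or its real object

@dataclass
class CAE:
    """Working area covered by WiFi and populated with RFID tags."""
    tags: Dict[str, RfidTag] = field(default_factory=dict)

    def place_tag(self, tag: RfidTag) -> None:
        self.tags[tag.tag_id] = tag

@dataclass
class Actor:
    """Mobile actor carrying a wearable computer."""
    name: str
    area: CAE

    def read_tag(self, tag_id: str) -> Optional[str]:
        # The RFID reader yields the contextual data of a nearby tag,
        # or nothing if no such tag exists in the area.
        tag = self.area.tags.get(tag_id)
        return tag.data if tag else None

# Illustrative use: a technician reads the tag fixed on an appliance.
area = CAE()
area.place_tag(RfidTag("valve-3", "last serviced 2009-04-02"))
technician = Actor("technician", area)
print(technician.read_tag("valve-3"))
```

In the real platform the back-office connection would let the actor follow a tag read with a query to a database server; the sketch stops at the tag-reading step, which is the part the paragraph describes.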

Key Terms in this Chapter

Tangible User Interface (TUI): A user interface in which a person interacts with digital information through the physical environment. The initial name, Graspable User Interface, is no longer used.

Contextual Cooperative Mobile (COCOMO) Learning: Situated just-in-time learning that allows the learner to contact a teacher in real time and cooperate with him or her.

Augmented Reality (AR): Augmented reality (AR) is a term for a live direct or indirect view of a physical, real-world environment whose elements are augmented by computer-generated sensory input, such as sound or graphics. It is related to a more general concept called mediated reality, in which a view of reality is modified (possibly even diminished rather than augmented) by a computer.

Interaction Tasks, Techniques and Atoms: Device-dependent and device-independent interaction.

IMERA Platform: French acronym for Mobile Interaction in Real Augmented Environment Platform, taking into account augmented actors, augmented environments and augmented appliances.

Ambient Intelligence (AmI): In computing, ambient intelligence (AmI) refers to electronic environments that are sensitive and responsive to the presence of people. In an ambient intelligence world, devices work in concert to support people in carrying out their everyday activities, tasks and rituals in an easy, natural way, using information and intelligence hidden in the network connecting these devices.

MOCOCO: Acronym for Mobility – Cooperation – Contextualization.

Mobile Learning (ML): Learning in mobility, using a wearable computer and appropriate interaction devices.

Mobile Interaction (MI): User interaction in mobility, based on a wearable computer with appropriate interaction devices.

Wearable Computer (WC): Wearable computers are computers that are worn on the body. They are especially useful for applications that require computational support while the user’s hands, voice, eyes, arms or attention are actively engaged with the physical environment.

MDA-Based Approach: An application development approach based on Model Driven Architecture (MDA) engineering and development.

Ubiquitous Computing (UC): Ubiquitous computing is a post-desktop model of human-computer interaction in which information processing has been thoroughly integrated into everyday objects and activities.

Domain Specific Languages: MDA description languages devoted to specific domains.

Capillary Cooperative System (CCS): We use this term by analogy with the network of blood vessels. The purpose of a Capillary CS is to extend the capacities provided by cooperative working tools into increasingly fine ramifications, so that users can work from both fixed workstations and handheld devices.

Real Augmented Environment (RAE): A real environment equipped with appropriate wireless sensors and actuators.
