Nikko: A Sensor Management System for Ambient Intelligence and Urban Computing Environments

Guillermo Cueva-Fernandez (University of Oviedo, Spain) and Martin Gonzalez-Rodriguez (University of Oviedo, Spain)
DOI: 10.4018/978-1-61520-655-1.ch042


Nowadays smartphones are equipped with a large number of sensors, including GPS, accelerometers, cameras, microphones, and light sensors. These sensors are well suited for sensing the context in which a user is located. This chapter describes a simple framework able to recognize different kinds of events triggered by sensors, distributing them among interactive objects. The framework has been tested in an application that recognizes a user's movement in indoor and outdoor locations, and in another that detects a user's falls and car accidents by analyzing the information provided by the accelerometers of a smartphone. The proposed application illustrates the use of the framework, based on the combination of the sensors available in a common smartphone. This approach tries to minimize the localization inaccuracy typical of current systems. One of the main motivations of this chapter is to demonstrate that different kinds of sensors can be successfully managed and added to an application.
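The fall and accident detection described above can be sketched with a simple threshold test on the accelerometer signal. The sketch below is illustrative only and is not the chapter's implementation; the threshold value and the `detect_fall` helper are hypothetical, and a real detector would be tuned empirically and combined with further checks (e.g., post-impact inactivity).

```python
import math

# Hypothetical impact threshold, in multiples of g; real systems tune this.
FALL_THRESHOLD_G = 2.5

def magnitude(ax, ay, az):
    """Total acceleration magnitude in g from the three accelerometer axes."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples, threshold=FALL_THRESHOLD_G):
    """Return True if any sample exceeds the impact threshold.

    `samples` is an iterable of (ax, ay, az) tuples in units of g.
    """
    return any(magnitude(ax, ay, az) > threshold for ax, ay, az in samples)

# At rest the magnitude is about 1 g (gravity); a fall or crash shows a
# short spike well above it.
readings = [(0.0, 0.0, 1.0), (0.1, -0.2, 1.1), (2.4, 1.8, -1.5)]
print(detect_fall(readings))  # True: the last sample's magnitude is about 3.3 g
```

The same magnitude-based test works regardless of how the phone is oriented, which is why the vector norm is used rather than any single axis.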
Chapter Preview

Many kinds of sensors are used in positioning technologies, including radio signal time (Cheng et al., 2004), signal strength (Niculescu & Nath, 2003; Nagpal, 1999), infra-red (Stoleru et al.), and ultrasound (Nissanka et al., 2000) sensors. As described by Zendjebil et al. (2008), the idea of combining several sensors is not recent. Vieville et al. (1993) proposed using an inertial sensor together with vision for autonomous robotics. The first outdoor AR systems, such as MARS (Mobile Augmented Reality System) (Hollerer, 1999) and BARS (Battlefield Augmented Reality System) (Julier et al., 2000), used a GPS to estimate the absolute position of the user, and an inertial sensor coupled with an electronic compass to estimate the orientation.
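The GPS-plus-inertial pairing used by systems like MARS and BARS can be illustrated with a minimal complementary blend: an absolute but noisy GPS fix is weighted against a smooth but drifting dead-reckoned estimate. This is a simplified sketch, not the method of any of the cited systems; the `fuse` function and the weight `alpha` are assumptions for illustration.

```python
def fuse(gps_pos, inertial_pos, alpha=0.8):
    """Weighted blend of an absolute GPS fix and a dead-reckoned inertial
    estimate (both as (x, y) tuples in the same frame).

    `alpha` weights the GPS reading; a hypothetical fixed value stands in
    for the adaptive weighting a real fusion filter would compute.
    """
    return tuple(alpha * g + (1 - alpha) * i
                 for g, i in zip(gps_pos, inertial_pos))

# A GPS fix of (10.0, 20.0) blended with an inertial estimate of (10.4, 19.6):
print(fuse((10.0, 20.0), (10.4, 19.6)))  # roughly (10.08, 19.92)
```

In practice this role is played by a Kalman or complementary filter, which adapts the weighting to each sensor's current uncertainty rather than using a fixed constant.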

Zendjebil et al. (2008) describe two strategies of combination:

Key Terms in this Chapter

Urban Computing: Is a field that focuses on how to apply technological solutions in public environments such as cities or parks.

Ambient Intelligence: Refers to the capability of an electronic system to perceive and react to the presence of people.

Augmented Reality Systems: Refers to systems that aim to duplicate the world’s environment in a computer. These systems generate user views that are combinations of the real scenes viewed by the user and virtual scenes generated by the computer that augment the scenes with additional information.

Augmented Accessibility: Is the ability to complement physical information that is available but not accessible to a disabled user with virtual information accessible through that user's remaining channels of perception; e.g., augmented accessibility can describe the surrounding environment of a blind user through virtual information transmitted by voice.

Global Positioning System (GPS): Is a system involving satellites, computers and receivers that is used to determine the location (i.e., latitude and longitude) of a receiver by calculating the time difference for signals from different satellites to reach the receiver.
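The time-difference principle in the GPS definition above can be shown with a toy 2-D trilateration: each travel time gives a distance (speed of light times time), and the receiver lies at the intersection of the resulting circles. This is a simplified sketch under stated assumptions; the `trilaterate` helper is hypothetical, and real GPS works in three dimensions and additionally solves for the receiver clock bias.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def trilaterate(p1, p2, p3, t1, t2, t3):
    """2-D position from signal travel times to three anchors at known
    positions. Subtracting the circle equations pairwise linearizes the
    problem into two linear equations in (x, y)."""
    r1, r2, r3 = C * t1, C * t2, C * t3
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Toy example: anchors at known positions, receiver actually at (30, 40) m.
anchors = ((0.0, 0.0), (100.0, 0.0), (0.0, 100.0))
times = tuple(math.hypot(30.0 - x, 40.0 - y) / C for x, y in anchors)
x, y = trilaterate(*anchors, *times)
print(round(x, 6), round(y, 6))  # recovers approximately 30.0 40.0
```

With only the time differences known (rather than absolute times), a fourth anchor is needed, which is why a GPS receiver requires at least four satellites in view.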

Accelerometer: Is a device that measures acceleration and gravity-induced forces, used to detect motion and changes in orientation.

Mobile Device: Refers to a device that allows people to access data and information wherever they are; mobile devices are a key component of mobile computing, a paradigm that enables the use of technology while moving, without the need to deploy any device in a stationary configuration.
