Panoramic Street-View Exploration using a Multi-Display Mobile Application


Vlad Stirbu, Petros Belimpasakis
Copyright: © 2013 |Pages: 14
DOI: 10.4018/jhcr.2013010101

Abstract

In this paper the authors experiment with multi-display mobile applications that can be used in environments where multiple smartphones are co-located within the same physical space. Utilizing the Remote User Interface interaction metaphor and the REST architectural style, they propose a solution that follows the Remote Model-View-Controller model, in such a way that client devices do not need application-specific software pre-installed. The authors demonstrate the system with the Panorama Bricks application, which displays street-view-style mirror-world panoramas in a synchronized, multi-display expanded view. The architecture shows that such enhanced application scenarios can be implemented even today, using off-the-shelf smartphones. Their evaluations show that responsiveness levels remain high, even in scenarios where multiple objects are overlaid on top of the mirror-world panoramas.

Introduction

Organizing information in a spatial manner for application domains such as cartography, remote sensing, land surveying, urban planning, and navigation is typically handled by Geographical Information Systems (GIS) that rely on sophisticated geometric spatial data models of the environment. The data exposed by such systems is often visualized through an application or a service User Interface (UI). In its simplest form, a map view is used as a basic UI component in GIS systems. Google Maps is a popular example for visualizing geo-spatial data: through its easy-to-use APIs, visualizing information on top of a map has become a widely used way for modern services to geo-reference user-generated content, Points of Interest (POIs), or other digital data. The term Mixed Reality (MR) refers to the merging of real and virtual worlds to produce new environments and visualizations in which physical and digital objects co-exist and are linked to each other. Mixed Reality has been defined by Milgram and Kishino (1994) with the help of the concept of the Virtuality Continuum (VC), as shown in Figure 1.

Figure 1. Virtuality Continuum (Nokia, 2009)

A street-level view enriched with other content extends GIS to what Cascio, Paffendorf, Smart, Bridges, Hummel, Hursthouse, and Moss (2007) define as Mirror Worlds: informationally-enhanced virtual models or “reflections” of the physical world. In a typical mirror world application, the user is able to browse around a location, pan and move, and in general explore the surroundings. As an early example, the Aspen moviemap (Lippman, 1980) was produced by MIT in the late 1970s: a series of spatially structured images provided an interactive navigation experience through the town of Aspen. The creation of panoramas, 360-degree views of a surrounding scene, dates back to the late 18th century. Panoramas allow the viewer to “look around”, while a moviemap allows moving around. The popular Google Street View (Vincent, 2007) links sequences of panorama photos created by specially equipped cars, quite similarly to the Aspen moviemap. The service is accessed via a standard web browser or a dedicated mobile application, allowing users to explore the real-world environment while they are there (in-situ) or remotely (e.g., while planning an upcoming trip). In more advanced research prototypes (Belimpasakis, Selonen, & You, 2010), such as the one shown in Figure 2, virtual content can also be placed on the surrounding buildings, thus allowing a more immersive mixing of real and digital content, an experience that fits under the Augmented Virtuality domain.

Figure 2. Mirror world city exploration application

As street-view images increase in resolution and the overlaid digital content becomes denser, consuming and experiencing them on a mobile device becomes harder. Mobile devices typically have small screens, so the full potential of the data is constrained by the limited rendering area. We experiment with a system that allows consumption of mirror world content in a multi-display fashion, letting users pair two or more (mobile or desktop) devices so that the combined display and rendering area is wider. This also makes the mirror world exploration experience more social, as two people can get together and link their devices for an overall better experience. For joining devices we utilize remote UI techniques, which allow the “server” device to expose parts of the imagery to other “client” devices in a synchronized manner. We design the system in such a way that clients are “generic” and do not hold any application-specific logic.
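To make the idea concrete, the following is a minimal sketch, not the authors' actual implementation, of how such a Remote Model-View-Controller arrangement over REST might look: the “server” device keeps the model (the current panorama and heading) and exposes it as HTTP resources, while a paired “client” device merely fetches the slice of the view assigned to it, so no application-specific software is required on the client beyond an HTTP-capable renderer. The resource names (/viewport, /heading) and the offset_deg parameter are illustrative assumptions, not the paper's API.

```python
# Minimal sketch of a Remote MVC "server" role over REST (assumed resource
# names; not the Panorama Bricks implementation). The server holds the shared
# model; generic clients only fetch and render their assigned viewport slice.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

# Shared model: current panorama, heading (degrees), and per-display field of view.
MODEL = {"panorama_id": "pano-001", "heading_deg": 90.0, "fov_deg": 60.0}


class ViewportHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET /viewport?offset_deg=60 -> the slice that a client display placed
        # 60 degrees to the right of the server display should render.
        url = urlparse(self.path)
        if url.path != "/viewport":
            self.send_error(404)
            return
        offset = float(parse_qs(url.query).get("offset_deg", ["0"])[0])
        view = {
            "panorama_id": MODEL["panorama_id"],
            "heading_deg": (MODEL["heading_deg"] + offset) % 360.0,
            "fov_deg": MODEL["fov_deg"],
        }
        body = json.dumps(view).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_PUT(self):
        # PUT /heading with a JSON body {"heading_deg": ...} plays the
        # controller role: it updates the shared model, so clients that
        # refresh /viewport stay synchronized with the server display.
        if urlparse(self.path).path != "/heading":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", "0"))
        update = json.loads(self.rfile.read(length))
        MODEL["heading_deg"] = float(update["heading_deg"]) % 360.0
        self.send_response(204)
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("", 8080), ViewportHandler).serve_forever()
```

In this sketch, a client paired to the right of the server display could poll GET /viewport?offset_deg=60 and render the returned heading with any panorama renderer, while the server updates /heading as the user pans; every paired display then tracks the same shared model.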
