Real-Time Query Processing on Live Videos in Networks of Distributed Cameras

Rui Peng, Alex J. Aved, Kien A. Hua
DOI: 10.4018/jitn.2010010103

Abstract

With the proliferation of inexpensive cameras and the availability of high-speed wired and wireless networks, systems of distributed cameras are becoming an enabling technology for a broad range of interdisciplinary applications in domains such as public safety and security, manufacturing, transportation, and healthcare. Today’s live video processing systems on networks of distributed cameras, however, are designed for specific classes of applications. To provide a generic query processing platform for applications of distributed camera networks, the authors designed and implemented a new class of general-purpose database management systems, the live video database management system (LVDBMS). The authors view networked video cameras as a special class of interconnected storage devices, and allow the user to formulate ad hoc queries expressed over real-time live video feeds. This paper introduces their system and presents the live video data model, the query language, and the query processing and optimization technique.
Article Preview

1. Introduction

Cameras are a special class of sensors widely used in applications ranging from traffic monitoring, public safety, and security to healthcare and environmental sensing. They generate huge amounts of data in the form of live video streams. Unlike entertainment content, the live video captured by such cameras is generally uneventful. People who constantly monitor these cameras, such as baggage screeners at airports, can quickly become fatigued. Moreover, when tens of thousands of cameras are present, such as those deployed on streets in a city, it is expensive, if physically feasible at all, to house the huge number of monitors required to support the various monitoring operations. In many scenarios, critical events do not happen very often, and it is quite inefficient to have one operator constantly and intently tracking only a few cameras. With recent advances in VLSI technology, networks of distributed cameras, consisting of a large number of inexpensive video cameras and distributed processors, will bring far greater monitoring coverage. It will become virtually impossible for human beings to keep track of all the objects or events under each camera. The distributed nature of these networks, coupled with the real-time nature of live videos, greatly complicates the development of techniques, architectures, and software that aim to effectively mine data from a sea of live video feeds. In this paper, we propose a Live Video Database Management System (LVDBMS) to address the aforementioned problems. LVDBMS is designed to allow users to easily focus on events of interest from a multitude of distributed video cameras by posing continuous queries on the live video streams. With LVDBMS automatically and continuously monitoring live video feeds and handling the intercommunication among distributed processors, a user can manage a very large number of cameras from a single desktop display and receive notifications when critical events happen.

We introduce a novel concept called Live Video Databases, a new class of databases built upon a multitude of real-time live video streams. The fundamental difference between a video database and a live video database is as follows: while the former deals with stored video files, the latter deals with real-time video data streaming from cameras, which are treated as a special class of “storage” devices. In this paper, we focus on the continuous query model and distributed query processing techniques for spatiotemporal live video databases. Query processing is carried out in three main stages: query registration, query decomposition, and query execution. In the first stage, a user poses a continuous query by registering a predicate and an action with the database management system, which evaluates the condition continuously in real time. An event-based textual query language is proposed for users to specify spatiotemporal queries on live video streams. The language offers sufficient expressiveness for defining events based on spatiotemporal relations between objects across multiple live videos, and it allows users to specify actions to be taken upon detecting such an event. In the second stage, a complex query, possibly spanning multiple live video sources, is decomposed into single-video sub-queries. In the last stage, sub-queries are evaluated on individual live videos in real time. The results from the sub-queries are synthesized to form the final answer, which indicates whether the user-specified event has occurred. Once the predicate is satisfied, it triggers the LVDBMS to execute the user-specified actions (e.g., notify the user and archive relevant video clips).
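To make the three-stage flow concrete, the following Python sketch mirrors it in miniature. It is an illustration under stated assumptions only: the class names, the per-camera predicates, the event-combination logic, and the example query mentioned in the comments are hypothetical and are not the LVDBMS query language or API described in this paper.

from dataclasses import dataclass
from typing import Callable, Dict, List

Frame = dict  # stand-in for a decoded video frame plus its detected objects

@dataclass
class SubQuery:
    camera_id: str                        # each sub-query is bound to a single live video
    predicate: Callable[[Frame], bool]    # per-frame spatiotemporal check on that video

@dataclass
class ContinuousQuery:
    sub_queries: List[SubQuery]
    combine: Callable[[Dict[str, bool]], bool]   # synthesizes the sub-query results
    action: Callable[[], None]                   # fired once the event is detected

# Stages 1 and 2 (registration and decomposition), written out by hand here:
# a hypothetical textual query such as "a person appears at camera3 AND a bag
# appears at camera7" decomposes into one sub-query per camera.
query = ContinuousQuery(
    sub_queries=[
        SubQuery("camera3", lambda f: "person" in f.get("objects", [])),
        SubQuery("camera7", lambda f: "bag" in f.get("objects", [])),
    ],
    combine=lambda results: all(results.values()),
    action=lambda: print("event detected: notify user, archive relevant clips"),
)

# Stage 3 (execution): evaluate each sub-query on its camera's latest frame,
# then synthesize the final answer and trigger the action if the event holds.
def evaluate(q: ContinuousQuery, latest_frames: Dict[str, Frame]) -> None:
    results = {sq.camera_id: sq.predicate(latest_frames[sq.camera_id])
               for sq in q.sub_queries}
    if q.combine(results):
        q.action()

evaluate(query, {"camera3": {"objects": ["person"]},
                 "camera7": {"objects": ["bag"]}})

In the actual system, the evaluation loop runs continuously over incoming frames and across distributed processors; the sketch collapses that into a single synchronous call purely for readability.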

The main contributions of this paper are:

  • A spatiotemporal data model for live video streams and events of interest from distributed cameras,

  • An event-based query language for composing spatiotemporal queries, and

  • Query processing techniques for computing query results in a network of cameras

The remainder of this paper is organized as follows. In Section 2, we discuss related work, including traditional video DBMSs and continuous query processing techniques. We introduce the proposed techniques for live video database management systems in Section 3. The design and implementation of the LVDBMS prototype are presented in Section 4. Finally, we offer our conclusions and discuss future work in Section 5.
