Big Data Techniques and Applications

Gamze Özel (Hacettepe University, Turkey)
Copyright: © 2014 | Pages: 10
DOI: 10.4018/978-1-4666-5202-6.ch032

Background

Big data is a popular term describing the exponential growth, availability, and use of information, both structured and unstructured. Big data is data that exceeds the processing capacity of conventional database systems (SAS, 2012): the data is too big, moves too fast, or does not fit the structures of existing database architectures. As seen in Figure 1, big data is commonly defined using three data characteristics: volume, variety, and velocity. When the volume, variety, and velocity of the data increase to the point that current techniques and technologies can no longer handle its storage and processing, the data is defined as big data. Big data are therefore sometimes characterized by the “3Vs”: more volume, more variety, and higher velocity (Douglas, 2001). These three dimensions describe the fronts along which a data set must expand before it merits being called big data.

Figure 1.

The 3 V's model is the most commonly used description of big data

(Source: http://www.thinkinc.com/blog/the-future-of-big-data-and-the-data-scientist-in-2013/)

Key Terms in this Chapter

Data Warehouse: Delivers deep operational insight with advanced in-database analytics.

Information Integration and Governance: Allows you to understand, cleanse, transform, govern and deliver trusted information to your critical business initiatives.

Hadoop: Enables distributed processing of large data sets across commodity server clusters.

Application Development: Streamlines the process of developing big data applications.

Accelerators: Speed time-to-value with analytical and industry-specific modules.

Visualization and Discovery: Helps users explore large, complex data sets.

Systems Management: Monitors and manages big data systems for secure and optimized performance.

Stream Computing: Enables continuous analysis of massive volumes of streaming data with sub-millisecond response times.
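The stream computing term above can be illustrated with a minimal sketch: rather than re-scanning stored data, a streaming system updates an aggregate incrementally as each element arrives, which is what makes very low response times possible. The class name and window size below are illustrative choices, not anything defined in the chapter.

```python
from collections import deque

class SlidingWindowMean:
    """Continuously maintain the mean of the last `size` stream elements."""

    def __init__(self, size):
        self.size = size
        self.window = deque()   # elements currently inside the window
        self.total = 0.0        # running sum, updated incrementally

    def update(self, value):
        # O(1) work per arriving element: add the new value, evict the
        # oldest one once the window is full, and return the fresh mean.
        self.window.append(value)
        self.total += value
        if len(self.window) > self.size:
            self.total -= self.window.popleft()
        return self.total / len(self.window)

# Simulate an unbounded stream; each arrival yields an up-to-date answer.
agg = SlidingWindowMean(size=3)
means = [agg.update(x) for x in [10, 20, 30, 40, 50]]
# means == [10.0, 15.0, 20.0, 30.0, 40.0]
```

The design choice here mirrors the key idea of stream computing: the analysis is pushed to the data as it flows past, so the cost per element is constant regardless of how much data the stream has already carried.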
