Big Data Analysis in IoT

Aqeel-ur Rehman, Rafi Ullah, Faisal Abdullah
Copyright: © 2017 | Pages: 15
DOI: 10.4018/978-1-5225-1832-7.ch018

Abstract

In IoT, data management is a big problem due to the connectivity of billions of devices, objects, and processes generating big data. Since the Things do not follow any specific (common) standard, the analysis of such data becomes a major challenge. There is a need to elaborate the characteristics of IoT-based data in order to identify the available and applicable solutions. Such a study also helps in realizing the need for new techniques to cope with these challenges. Due to the heterogeneity of connected nodes and their differing data rates and formats, dealing with such a variety of data is a huge challenge. At the same time, because IoT provides processing nodes in quantity in the form of smart nodes, it presents itself as a good platform for big data analysis. In this chapter, the characteristics of big data and the requirements for big data analysis are highlighted. Considering IoT both as a major source of data generation and as a plausible platform for analyzing such huge data, the associated challenges are also underlined.
Chapter Preview

Introduction

The Internet of Things (IoT) is a concept of providing uniquely identifiable objects with connectivity to the Internet. When billions of things connect, it becomes difficult to manage and analyze the resulting huge amount of data, as each object sends and retrieves data. Many challenges are associated with the analysis of big data in IoT due to heterogeneity, variable data formats, differing priorities, and, especially, the sheer number of connected devices.

Big data refers to a huge amount of data of all types. Traditionally, data is collected, processed, and then moved to a data warehouse for analysis. When a large amount of data is collected from different sources, it is not necessarily relational data; such data can be treated as big data. As data becomes more varied, more complex, and less structured, it has become imperative to process it quickly. Meeting such demanding requirements poses an enormous challenge for traditional databases and scale-up infrastructures. Big Data therefore also refers to the new scale-out architectures that address these needs.

In IoT, data management is a big problem due to the connectivity of billions of devices, objects, and processes generating big data. Since the Things do not follow any specific (common) standard, the analysis of such data becomes a major challenge. There is a need to elaborate the characteristics of IoT-based data in order to identify the available and applicable solutions (Shi & Liu, 2011). Such a study also helps in realizing the need for new techniques to cope with these challenges.

Big Data

Big data refers to a huge amount of data of all types. Traditionally, data is collected, processed, and then moved to a data warehouse for analysis. When a large amount of data is collected from different sources, it is not necessarily relational data; such data can be treated as big data.

As data becomes more varied, more complex, and less structured, it has become imperative to process it quickly. Meeting such demanding requirements poses an enormous challenge for traditional databases and scale-up infrastructures. Big Data refers to the new scale-out architectures that address these needs (O'Leary, 2013).
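To make the scale-out idea concrete, the following minimal sketch (synthetic readings and hypothetical field names, not taken from the chapter) partitions a batch of sensor values and aggregates each partition in a separate process, in the same map-and-merge spirit that scale-out architectures apply across cluster nodes.

```python
# Minimal sketch of scale-out style aggregation (assumed example, not from the chapter).
# A batch of sensor readings is partitioned and reduced in parallel, mirroring how
# scale-out architectures split Big Data work across nodes.
from multiprocessing import Pool
from functools import reduce

def partial_stats(chunk):
    """Map step: reduce one partition of readings to (count, sum, min, max)."""
    values = [r["value"] for r in chunk]
    return (len(values), sum(values), min(values), max(values))

def merge(a, b):
    """Reduce step: combine two partial results."""
    return (a[0] + b[0], a[1] + b[1], min(a[2], b[2]), max(a[3], b[3]))

if __name__ == "__main__":
    # Synthetic readings standing in for data arriving from many IoT nodes.
    readings = [{"sensor": i % 100, "value": float(i % 37)} for i in range(100_000)]
    partitions = [readings[i::4] for i in range(4)]     # split across 4 "nodes"
    with Pool(4) as pool:
        partials = pool.map(partial_stats, partitions)  # map: run on each partition
    count, total, lo, hi = reduce(merge, partials)      # reduce: merge partial results
    print(f"count={count} mean={total / count:.2f} min={lo} max={hi}")
```

In a real deployment the partitions would live on different machines and the merge would run in a framework such as Hadoop or Spark, but the map-then-merge structure is the same.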

Big Data is characterized by a set of attributes, often referred to as the multi-V model (Assunção, Calheiros, Bianchi, Netto, & Buyya, 2014); a small profiling sketch follows the list below.

  • Variety: Data types

  • Velocity: Data production and processing speed

  • Volume: Data size

  • Veracity: Data reliability and trust

  • Value: Worth derived from exploiting Big Data
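The first three Vs can be made concrete by profiling a window of incoming records; veracity and value are not directly measurable from raw payloads. The sketch below (assumed JSON-style records and hypothetical field names) estimates volume, velocity, and variety for one time window.

```python
# Sketch: profiling a window of IoT records along three of the Vs
# (volume, velocity, variety). Record layout and field names are assumed.
import json
import time

def profile_window(records, window_seconds):
    volume_bytes = sum(len(json.dumps(r)) for r in records)  # Volume: payload size
    velocity = len(records) / window_seconds                 # Velocity: records per second
    schemas = {tuple(sorted(r.keys())) for r in records}     # Variety: distinct record shapes
    return {
        "volume_bytes": volume_bytes,
        "velocity_rps": velocity,
        "variety_schemas": len(schemas),
    }

if __name__ == "__main__":
    window = [
        {"id": "temp-01", "t": time.time(), "celsius": 21.4},
        {"id": "cam-07", "t": time.time(), "frame": "base64..."},
        {"id": "gps-03", "t": time.time(), "lat": 24.86, "lon": 67.01},
    ]
    print(profile_window(window, window_seconds=10))
```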

Big Data presents a complex range of analysis and use problems, which can include the following (Villars, Olofson, & Eastwood, 2011); a small ingestion sketch follows the list:

  • Having a computing infrastructure that can ingest, validate, and analyze high volumes (size and/or rate) of data

  • Assessing mixed data (structured and unstructured) from multiple sources

  • Dealing with unpredictable content with no apparent schema or structure

  • Enabling real-time or near real-time collection, analysis, and answers
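As one illustration of the first and last points, the sketch below ingests mixed, loosely structured records and keeps a near real-time running answer. An in-process queue stands in for whatever broker a real deployment would use, and the record fields (e.g. device_id) are assumptions for the example.

```python
# Sketch: near real-time ingestion of mixed, loosely structured records.
# A queue stands in for a real message broker; validation is deliberately
# tolerant because incoming content has no guaranteed schema.
import json
import queue

incoming = queue.Queue()

def validate(raw):
    """Accept JSON objects that at least carry a device id; reject everything else."""
    try:
        record = json.loads(raw)
    except json.JSONDecodeError:
        return None
    return record if isinstance(record, dict) and "device_id" in record else None

def consume(running_totals):
    """Drain the queue, keeping a per-device count as a near real-time answer."""
    while not incoming.empty():
        record = validate(incoming.get())
        if record is not None:
            device = record["device_id"]
            running_totals[device] = running_totals.get(device, 0) + 1
    return running_totals

if __name__ == "__main__":
    for raw in ['{"device_id": "a1", "temp": 20.5}', 'not-json',
                '{"device_id": "a1"}', '{"no_id": 1}']:
        incoming.put(raw)
    print(consume({}))  # {'a1': 2}
```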
