Big Data and Cloud Interoperability


Ahmad Yusairi Bani Hashim (Universiti Teknikal Malaysia Melaka, Malaysia)
Copyright: © 2016 | Pages: 11
DOI: 10.4018/978-1-4666-9649-5.ch004

Abstract

Cloud computing provides individuals and enterprises with access to high volumes of data. Large-scale data analytics and mobile technologies can yield useful knowledge from these data. Data storage, security, and privacy, on the other hand, are the main issues for any organization or enterprise in the business world. Big data processing is within reach due to easy access to cloud architectures and open-source software. Interoperability of big data and the cloud has therefore become a necessity in this data-intensive era.

Introduction

Cloud Computing (CC) implements shared computer resources over an Internet connection. In CC, the resources must be accessible on demand, and they must adhere to the concept of delivering computing as a service. Cloud computing provides individuals and enterprises with access to high volumes of data, and large-scale data analytics and mobile technologies can yield useful knowledge from these data. One advantage of cloud computing infrastructure is that it provides large-scale data storage, processing, and distribution. The importance of CC technology also lies in its potential to reduce investment and infrastructure costs. The central issues for any organization or business are data storage, security, and privacy. Companies increasingly rely on online file access, so CC is now becoming an industry trend. Moreover, companies need to obtain the necessary knowledge as a basis for intelligent services and decision-making systems, which are the distinguishing features of big data science. Big data processing is within reach due to easy access to cloud architectures and open-source software. Interoperability of big data and the cloud has therefore become a necessity in this data-intensive era.

Important Aspects of Big Data and Cloud Interoperability

There are four important aspects of big data and cloud interoperability:

  • 1. Structural Models

  • 2. Data and Semantics

  • 3. Interconnection, Network, and Interoperability

  • 4. Security, Performance, and Privacy

Structural Models

Based on the characteristics of the cloud environment, a client can identify its requirements for the infrastructure and framework, the storage resources, and the associated technologies.

  • 1. Infrastructure Requirement and Framework: Cloud computing infrastructure provides large-scale data storage, processing, and distribution. The infrastructure allows enterprises worldwide to use the same resources without setting up similar resources locally, thereby reducing energy emissions. Mobile CC infrastructure could be one option because it provides extensive storage with capabilities for limiting power dissipation. Information and communication technology accounts for about four percent of global carbon dioxide emissions (Aminzadeh, Sanaei, & Ab Hamid, 2014). Global-scale allocation of computing resources could improve energy efficiency while assuring high performance (Ebejer, Fulle, Morris, & Finn, 2013; Addis, Ardagna, Capone, & Carello, 2014; Sultan, 2013).

  • 2. Storage Resource: Variability in data volumes, caused for example by parallel memory bottlenecks, deadlocks, and inefficiencies that arise in processing big graph data, can lead to inconsistent computing and storage requirements. The types and speeds of CC applications are examples of runtime resource configurations (RRC). The RRC should be re-evaluated whenever such data volume variability occurs (Li, Xu, Kang, Yow, & Xu, 2014). Computationally intensive applications that run on demand can be handled by optimizing the computing power of CPU- and GPU-based cloud resources.

  • 3. Big Data and Cloud Technologies: Apache Hadoop offers a platform for distributed and parallel data processing. Big geo-data analytics is an emerging big data technology that applies Hadoop clusters to connect geographic information systems (GIS) to the cloud computing environment. MapReduce and NoSQL, on the other hand, are examples of data architecture technologies used in connecting geographic information systems to the CC environment (Gao, Li, Li, Janowicz, & Zhang, 2014).
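The MapReduce model mentioned above can be illustrated with a minimal single-process word-count sketch. On a real Hadoop cluster the map and reduce phases run distributed across many nodes; here they are simulated in plain Python, and the sample input splits are invented for the example.

```python
from collections import Counter
from functools import reduce

# Minimal illustration of the MapReduce model: each "map" task emits
# (word, count) pairs for one input split, and the "reduce" step merges
# the partial counts. Hadoop distributes these phases across a cluster;
# this sketch runs them sequentially in one process.

def map_phase(document: str) -> Counter:
    """Map task: emit word counts for one input split."""
    return Counter(document.lower().split())

def reduce_phase(partial_counts) -> Counter:
    """Reduce step: merge the partial counts from all map tasks."""
    return reduce(lambda a, b: a + b, partial_counts, Counter())

# Two invented input splits standing in for distributed file blocks.
splits = ["big data in the cloud", "cloud data processing at scale"]
totals = reduce_phase(map_phase(s) for s in splits)
print(totals["data"], totals["cloud"])  # each appears once per split
```

The same shuffle-and-merge structure underlies Hadoop jobs regardless of scale; only the distribution of splits and the grouping of intermediate keys change.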
