Fog Computing and Virtualization

Siddhartha Duggirala
Copyright © 2018 | Pages: 12
DOI: 10.4018/978-1-5225-5649-7.ch010

Abstract

The essence of cloud computing is moving processing from local systems to remote systems. The cloud is an umbrella of physical and virtual services and resources that are easily accessible over the internet. As more companies adopt the cloud, either fully through the public cloud or through a hybrid model, the challenges in maintaining cloud-capable infrastructure also increase. About 42% of CTOs say that security is their main concern in moving to the cloud. Another problem, mainly an infrastructure problem, is connectivity. The datacenter can be considered the backbone of the cloud computing architecture. As the processing power and storage capabilities of end devices like mobile phones, routers, and sensor hubs improve, we can increasingly leverage these resources to improve the quality and reliability of services.

Introduction

Cloud computing has completely transformed how businesses function and handle their IT infrastructure. Consolidating all available resources and providing software-defined resources on demand has been a major efficiency driver. The main reasons cloud computing really took off are resource utilisation, efficiency, on-demand resource delivery, and the financial benefits associated with them.

Until recently, the processing power and storage available at end points like user PCs, embedded devices, or mobile phones were limited. So it made logical sense to move the burden of processing and storage to the cloud. An illustrative example of this is the Chromebook from Google, or any one of the plethora of cloud services we use every day. The one big disadvantage of these services and products is that they are completely network dependent. Heavy usage of network bandwidth and tight latency expectations place higher demands on the network infrastructure. This sometimes reduces the quality of experience for end users and, in extreme cases, can even be fatal.

As of 2017, there are about two devices connected to the internet for every human on Earth, and the number of connected devices is estimated to reach 50 billion by 2020. These include mobile phones, smart routers, home automation hubs, smart industrial machines, sensors, smart vehicles (Hou, Li, Chen et al., 2016), and the whole gamut of smart devices. To give an idea of how much data needs to be pushed through the internet by these devices: a Boeing aircraft generates about 1 TB of data, or even more, for every hour of operation, and weather sensors generate about 500 GB of data per day. Our mobile phone sensors are capable of generating more than 500 MB of logs per day, and that, multiplied by the number of smartphones, is a staggering amount of data. This, along with increasing rich-media usage on the internet, will be a huge challenge for next-generation networks.
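As a rough back-of-the-envelope illustration of these figures, the short sketch below totals the daily volumes quoted above. It is a minimal example only: the fleet size, hours of operation, and smartphone count are assumptions chosen for illustration and are not given in the chapter.

```python
# Back-of-the-envelope estimate of daily data volumes from the figures
# quoted above. Fleet size, hours of operation, and smartphone count
# are illustrative assumptions, not figures from the chapter.

TB = 1024  # GB per TB
GB = 1024  # MB per GB

aircraft_fleet = 20_000              # assumed number of aircraft in service
aircraft_hours_per_day = 10          # assumed hours of operation per aircraft
aircraft_gb_per_day = aircraft_fleet * aircraft_hours_per_day * 1 * TB  # ~1 TB/hour

weather_gb_per_day = 500             # ~500 GB/day from weather sensors (as cited)

smartphones = 5_000_000_000          # assumed number of smartphones worldwide
phone_gb_per_day = smartphones * 500 / GB   # ~500 MB of logs per phone per day

total_gb = aircraft_gb_per_day + weather_gb_per_day + phone_gb_per_day
print(f"Estimated total: {total_gb / (TB * 1024):.1f} PB per day")
```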

As the processing power and storage capabilities of end devices like mobile phones, routers, and sensor hubs improve, we can increasingly leverage these resources to improve the quality and reliability of services. Processing, or even just caching, data near where it is generated or frequently used not only offloads the burden on the networks but also improves decision-making capabilities for commercial and industrial installations and the quality of experience for personal usage.
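A minimal sketch of this idea follows: a hypothetical fog node keeps recent sensor readings in a local cache and forwards only periodic aggregates upstream, so the bulk of the raw data never crosses the wide-area network. The class and function names are illustrative and not part of any particular fog platform.

```python
from collections import deque
from statistics import mean

class FogNode:
    """Illustrative fog node: caches raw readings locally and pushes
    only compact aggregates to the cloud (names are hypothetical)."""

    def __init__(self, window_size=100):
        self.cache = deque(maxlen=window_size)  # recent raw readings stay local

    def ingest(self, reading: float) -> None:
        self.cache.append(reading)

    def aggregate(self) -> dict:
        # Summarise the local window instead of shipping every sample upstream.
        return {
            "count": len(self.cache),
            "mean": mean(self.cache),
            "min": min(self.cache),
            "max": max(self.cache),
        }

def push_to_cloud(summary: dict) -> None:
    # Placeholder for the actual upstream call (e.g. an HTTPS or MQTT publish).
    print("forwarding aggregate:", summary)

node = FogNode()
for sample in (21.3, 21.5, 22.0, 21.8):   # raw sensor samples stay at the edge
    node.ingest(sample)
push_to_cloud(node.aggregate())            # only the summary crosses the WAN
```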

Handling the new generation of requirements for volume, variety, and velocity in IoT data requires us to re-evaluate our tools and technologies. Effective implementation of these use cases places the following requirements on the infrastructure:

  1. Minimise Latency: Milliseconds, even microseconds, matter when you are trying to prevent a failure at a nuclear power station, avert some other calamity, or influence a customer's buying decision. Analyzing data and gaining actionable insights as near to the device as possible makes all the difference between a cascading system failure and an averted disaster (a minimal decision sketch follows this list).

  2. Optimising Network Utilisation: The data generated by sensors is huge, and not all of it is useful. It is neither practical nor necessary to transport this vast amount of data to centralised processing stations or data centres.

  3. Security and Privacy: Data needs to be protected both in transit and at rest. This requires efficient encryption, monitoring, and automated response in case of any breach (Stojmenovic & Wen, 2014).

  4. Reliability: As more and more intelligent systems are deployed, their effect on the safety of citizens and critical infrastructure cannot be overlooked.

  5. Durability: The devices can be deployed across a wide range of environmental conditions, so they need to be durable and ruggedised to work efficiently in harsh environments like railways, deep oceans, utility field substations, and vehicles (Hou, Li, Chen et al., 2016).

  6. Geographic Distribution and Mobility: Fog devices should be dispersed geographically so as to provide storage and processing resources close to the sensors and actuators that produce data and act on the resulting decisions. The sensors themselves can be highly mobile, and the fog environment should be able to provide consistent resources even in these highly dynamic scenarios. This is especially the case with wireless sensor networks, personal body area networks, and mobile/vehicular ad hoc networks (MANET/VANET).

  7. Interoperability: Fog devices are intended to connect to all sorts of devices. Many of these devices use proprietary communication protocols and are not IP-based. In these cases, the fog nodes should be able to communicate with them and even translate their protocols to IP in case the data needs to be pushed to the cloud.
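To make the first two requirements concrete, the sketch below shows one possible, purely illustrative policy a fog node might apply: handle tight-deadline events locally and forward only loose-deadline traffic to the cloud. The latency constants, threshold logic, and function names are assumptions for illustration, not part of any specific fog framework.

```python
# Assumed round-trip estimates (seconds); a real deployment would measure these.
FOG_LATENCY = 0.005     # ~5 ms to process at the local fog node
CLOUD_LATENCY = 0.150   # ~150 ms round trip to a remote data centre

def actuate_locally(event: dict) -> None:
    print("local action for", event["type"])

def forward_to_cloud(event: dict) -> None:
    print("queued for cloud analytics:", event["type"])

def handle_event(event: dict, deadline_s: float) -> str:
    """Route an event based on its latency budget (illustrative policy)."""
    if deadline_s < CLOUD_LATENCY:
        # Deadline too tight for a cloud round trip: act at the edge.
        actuate_locally(event)
        return "handled at fog node"
    # Otherwise the cloud can absorb it, saving edge resources and bandwidth.
    forward_to_cloud(event)
    return "forwarded to cloud"

print(handle_event({"type": "overpressure_alarm"}, deadline_s=0.010))
print(handle_event({"type": "daily_usage_report"}, deadline_s=5.0))
```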
