Distributed Intelligence Platform to the Edge Computing

Xalphonse Inbaraj
Copyright: © 2020 | Pages: 23
DOI: 10.4018/978-1-7998-0194-8.ch007

Abstract

With the explosion of information, devices, and interactions, cloud design on its own cannot handle the flow of data. While the cloud provides compute, storage, and even connectivity that we can access easily and cost-effectively, these centralized resources can create delays and performance issues for devices and information that are far from a centralized public cloud or data center. Internet of Things-connected devices are a clear use case for edge computing architecture. In this chapter, the author discusses the main differences between edge, fog, and cloud computing; their pros and cons; and various applications, namely smart cars and traffic control in a transportation scenario, visual security and surveillance, connected vehicles, and smart ID cards.

Introduction

Cloud computing frees the enterprise and the end user from specifying many details. This bliss becomes a problem for latency-sensitive applications, which require nodes in the vicinity to meet their delay requirements. An emerging wave of Internet deployments, most notably the Internet of Things (IoT), requires mobility support and geo-distribution in addition to location awareness and low latency. We argue that a new platform is needed to meet these requirements, a platform we call Fog Computing, or, briefly, Fog, because the fog is a cloud close to the ground (Bonomi, Milito & Natarajan, 2014). With the explosion of data, devices, and interactions, cloud architecture on its own cannot handle the influx of information. While the cloud gives us access to compute, storage, and even connectivity that we can access easily and cost-effectively, these centralized resources can create delays and performance issues for devices and data that are far from a centralized public cloud or data center.

Edge computing, also known simply as "edge," brings processing close to the data source, so that data does not need to be sent to a remote cloud or other centralized system for processing. By eliminating the distance and time it takes to send data to centralized sources, we can improve the speed and performance of data transport, as well as of the devices and applications at the edge.
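To make the idea concrete, the following minimal sketch shows an edge device acting on readings locally and sending only a small summary upstream instead of the raw sample stream. The sensor, threshold, and upstream endpoint are hypothetical assumptions for illustration, not the chapter's implementation.

```python
# Minimal edge-processing sketch. The sensor, threshold, and upstream call
# are hypothetical placeholders, not the chapter's implementation.
import random
import statistics

TEMP_ALERT_C = 85.0        # assumed threshold for an immediate local action
SUMMARY_WINDOW = 100       # raw samples collected per summary sent upstream


def read_temperature_sensor() -> float:
    """Stand-in for a real sensor driver on the edge device."""
    return random.uniform(60.0, 90.0)


def send_upstream(payload: dict) -> None:
    """Stand-in for a call to a fog node or cloud endpoint."""
    print("upstream <-", payload)


def edge_loop(num_windows: int = 3) -> None:
    for _ in range(num_windows):
        window = [read_temperature_sensor() for _ in range(SUMMARY_WINDOW)]
        for value in window:
            if value > TEMP_ALERT_C:
                # Latency-sensitive decision made locally, no cloud round trip.
                send_upstream({"event": "overheat", "value": round(value, 1)})
                break
        # Only an aggregate crosses the network, not the raw sample stream.
        send_upstream({"mean": round(statistics.mean(window), 1),
                       "max": round(max(window), 1)})


if __name__ == "__main__":
    edge_loop()
```

The point of the sketch is only the placement of the logic: the decision and the aggregation happen on the device, so the network carries a handful of small messages rather than the full sample stream.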

Fog computing is a standard that defines how edge computing should work, and it facilitates the operation of compute, storage and networking services between end devices and cloud computing data centers. Additionally, many use fog as a jumping-off point for edge computing.

Therefore, we can identify some defining characteristics of Fog Computing: a) low latency and location awareness; b) widespread geographical distribution; c) mobility; d) a very large number of nodes; e) a predominant role of wireless access; f) a strong presence of streaming and real-time applications; and g) heterogeneity.

Both fog computing and edge computing provide the same functionality in terms of pushing data and intelligence to analytic platforms situated on, or close to, where the data originated, whether that is screens, speakers, motors, pumps, or sensors.

Fog computing is proposed to enable computing directly at the edge of the network, which can deliver new applications and services (Forman, 2003). For example, industrial edge routers are advertised with their processor speed, number of cores, and built-in network storage. Those routers have the potential to become new servers that provide services at the edge of the network; such devices are known as fog nodes. They can be resource-poor devices such as set-top boxes, access points, routers (Willis, Dasgupta & Banerjee, 2014), switches, base stations, and end devices, or resource-rich machines such as Cloudlets. Edge and fog computing architectures are closely tied to the Internet of Things (IoT): deployments that deal with remote sensors or devices are typically where edge computing and fog computing manifest in the real world. In this chapter, how Fog Computing emerged and its various applications through its edge nodes are discussed. For example, in the fog transportation system, the fog nodes perform some local analysis for local action, such as alerting the vehicle about poor road conditions and triggering an autonomous response to slow down, and they carry out some autonomous functions even when connections to higher layers are unavailable. In the surveillance scenario, video analytics algorithms can be placed on fog nodes close to the cameras and take advantage of the heterogeneous processing capability of the fog, running parts of the video analytics algorithm on conventional processors or accelerators.
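As a rough illustration of the transportation example above, the sketch below shows a fog node that analyses road-condition readings locally, alerts nearby vehicles immediately, and keeps operating and buffering data when the connection to higher layers is unavailable. The sensor, uplink check, and threshold are assumed names for illustration, not the chapter's design.

```python
# Illustrative fog-node sketch for the transportation scenario: local analysis
# for local action, with autonomous operation when the uplink is down.
# All names and thresholds are hypothetical assumptions.
import queue
import random

ROUGHNESS_ALERT = 0.7   # assumed road-condition threshold


def read_road_sensor() -> float:
    """Stand-in for a roadside vibration/roughness reading."""
    return random.random()


def cloud_reachable() -> bool:
    """Stand-in for a health check on the uplink to higher layers."""
    return random.choice([True, False])


def alert_vehicle(message: str) -> None:
    """Stand-in for a low-latency broadcast to nearby vehicles."""
    print(f"[local alert] {message}")


def fog_node_step(upload_buffer: queue.Queue) -> None:
    roughness = read_road_sensor()
    if roughness > ROUGHNESS_ALERT:
        # Local analysis for local action: no round trip to the data center.
        alert_vehicle(f"poor road surface ahead (index {roughness:.2f}), slow down")
    if cloud_reachable():
        # Forward buffered and current summaries to higher layers for global analytics.
        while not upload_buffer.empty():
            print("[uplink] forwarding", upload_buffer.get())
        print("[uplink] forwarding", {"roughness": round(roughness, 2)})
    else:
        # Autonomous operation continues; data is buffered until the uplink returns.
        upload_buffer.put({"roughness": round(roughness, 2)})


if __name__ == "__main__":
    buffer: queue.Queue = queue.Queue()
    for _ in range(5):
        fog_node_step(buffer)
```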

Both technologies can help organizations reduce their reliance on cloud-based platforms to analyze data, which often leads to latency issues, and instead be able to make data-driven decisions faster. The main difference between edge computing and fog computing comes down to where the processing of that data takes place.
