Overview of Computing Models

DOI: 10.4018/978-1-5225-2193-8.ch001

Introduction

From the early era of computing, centralization was the preferred approach, as it gave the controlling group a sense of control and authority over individuals and resources. Historically, most organizations preferred centralization in their process flows to keep control within a limited group and thereby retain a knowledge edge, or even a monopoly, in their respective business domains. Centralized computing refers to the allocation of all computing and storage resources to a single unit that controls and facilitates all computing and storage services for the organization. The focus of control in centralization is typically implemented with a server as the central node and appropriately networked low-end computers as client terminals (Click et al., 2006). In the centralized computing paradigm, processing is done at the central server, while the client terminals act as connected thin clients with very limited or no computing capability of their own. Centralization is an attempt to improve efficiency by exploiting potential economies of scale; it may also improve reliability by minimizing the opportunities for failures and errors.
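
The client-server relationship just described can be summarized in a short sketch. The following Python snippet is a minimal, illustrative model, not taken from the chapter: a single central server performs all processing while a thin client merely forwards a request and prints the response. The local host address, port number, and toy uppercase workload are assumptions made only for this example.

import socket
import threading
import time

# Minimal sketch of the thin-client model: all computation happens at the
# central node; the client only sends a request and displays the result.
# Host, port, and the "uppercase" workload are illustrative assumptions.

HOST, PORT = "127.0.0.1", 9100

def central_server():
    # Central node: accepts client requests and does all the processing.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            with conn:
                data = conn.recv(1024)
                if data:
                    conn.sendall(data.decode().upper().encode())

def thin_client(message):
    # Thin client: no local computation, just send and receive.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(message.encode())
        return cli.recv(1024).decode()

if __name__ == "__main__":
    threading.Thread(target=central_server, daemon=True).start()
    time.sleep(0.3)  # give the central server a moment to start listening
    print(thin_client("all processing is done at the central node"))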

In earlier times, most organizations preferred a centralized environment built on mainframe computing and created large data centers under their own control for physical and locational consolidation. Hosting data and services on a mainframe system ensured greater knowledge consolidation and protection for organizations in their businesses (King, 1983). It also allowed skilled staff to concentrate on the central servers for efficient and consistent performance. Centralization can create a simpler, easier-to-manage architecture that enables more standardization, control, and efficiency. However, every node connected to the central server needs well-defined responsibilities and communication channels for the system to work properly.

Figure 1. Centralized computing framework

Centralized computing offers data integrity and avoids data redundancy, which is essential in most critical systems. It also helps reduce the learning time for process changes, since only the central server needs to be updated with modifications. Centralization cuts hardware and software licensing costs by consolidating requirements. The most prominent limitation of a centralized network is bandwidth: since all data passes through the central node, that node can become a bottleneck. Centralized systems also carry a high initial cost (Corridori, 2012); they require costly infrastructure and a pool of experienced professionals for setup and maintenance. Moreover, a failure of the central server can make the entire system inoperable.

Rise of Decentralized Computing Platforms

The idea of decentralization dates back to the earliest era of human communication, when humans shared information with one another to ensure that knowledge would not be lost with the death of a single individual. Ancient tribes passed the vital information needed for their survival across generations in a decentralized manner and thereby paved the way for the evolution of mankind. At its base level, decentralization is the simple process of taking information and distributing copies of it across a network so as to enhance security, redundancy, trust, accessibility, and congruence. Decentralized computing distributes computing capabilities to many nodes spread across the network. The network consists of a collection of autonomous computers that may be geographically separated but can communicate with each other to provide a common, agreed-upon service (Suryanarayana, 2006). Decentralization took hold in the computing paradigm once its potential to accelerate decision making and enable faster, more efficient problem solving in diverse business domains was recognized. In a generic sense, decentralization shifts control from governments and central entities to a peer-to-peer network governed by immutable mathematical logic, providing better security, efficiency, and resilience with reduced timeframes and overheads.
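
As an illustration of this idea, the short Python sketch below simulates, in memory, a set of autonomous peer nodes that each keep their own replica of a piece of information and gossip it to randomly chosen peers, so that the loss of any single node does not lose the data. The node count, fan-out, and number of gossip rounds are assumptions made for the example rather than details given in the chapter.

import random

# Minimal in-memory sketch of decentralization: every node keeps its own
# copy of the shared information and pushes it to a few randomly chosen
# peers each round. Node count, fan-out, and rounds are illustrative.

class PeerNode:
    def __init__(self, name):
        self.name = name
        self.store = {}  # this node's local replica of the shared data

    def publish(self, key, value):
        self.store[key] = value

    def gossip(self, peers, fanout=2):
        # Push local entries to a small random subset of peers.
        for peer in random.sample(peers, min(fanout, len(peers))):
            peer.store.update(self.store)

if __name__ == "__main__":
    nodes = [PeerNode("node-%d" % i) for i in range(8)]
    nodes[0].publish("shared-record", "replicated across the network")

    for _ in range(5):  # a few gossip rounds are enough for full spread
        for node in nodes:
            node.gossip([p for p in nodes if p is not node])

    copies = sum("shared-record" in n.store for n in nodes)
    print("%d of %d nodes hold a copy" % (copies, len(nodes)))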
