Cloud computing is widely regarded as the realization of the long-held dream of consuming computing resources in the same way as public utilities. Although the term "cloud computing" entered the IT vocabulary only about four years ago, many people still wonder what it actually means. Some even argue that cloud computing is merely an old technology under a new name. Many questions arise on this subject. Why cloud computing? Is it the same thing as web hosting on third-party servers? How does it differ from other popular terms such as grid computing? Why should organizations consider it? And is it risk-free? IT professionals, business people, and academics continue to ask about cloud computing with the intention of better understanding and applying it. This chapter tries to demystify cloud computing by introducing and simplifying its terminology for readers with different IT interests.
Cloud Computing, The Beginning
Accessing computing resources as easily as one accesses water and electricity is a decades-old dream that has yet to be fully achieved. Professor John McCarthy, a well-known computer scientist who introduced time-sharing in late 1957, laid the groundwork for cloud computing by anticipating that corporations would sell computing resources through the utility business model (McCarthy, 1983). Soon after that, various firms began paying for their own use of computing resources, such as storage, processing, bulk printing, and software packages, available at service bureaus.
The desire to let customers outsource the processing and storage of their information triggered several cloud-like implementations. Some of these implementations were initiated during the last two decades, providing the public with enormous IT infrastructure for their use. Such models include: