Big Data and Cloud Computing: A Technological and Literary Background

Reinaldo Padilha França, Yuzo Iano, Ana Carolina Borges Monteiro, Rangel Arthur
Copyright: © 2021 | Pages: 22
DOI: 10.4018/978-1-7998-2791-7.ch002

Abstract

Big data refers to the analysis and interpretation of large volumes of data of great variety, which requires big data-specific solutions that enable IT professionals to work with unstructured information at great speed. Cloud computing is the on-demand delivery of computing power, database storage, applications, and other IT resources over the internet, with a price set according to usage. The relationship between cloud and big data is close: the former is the infrastructure that, in a corporate environment, supports the latter, providing enough capacity to process data in large volumes. The connection between these concepts stems from the fact that dealing with big data requires an infrastructure that allows the storage, processing, and retrieval of the most varied types of data on a large scale, that is, data in constant growth. Therefore, this chapter aims to provide an updated review of big data and cloud computing, presenting their successful relationship with a concise bibliographic background and categorizing and synthesizing the potential of both technologies.
Chapter Preview

Introduction

Charles Babbage, considered the father of today's computer, conceived in 1830 the world's first computer, a hundred years before such a machine became a reality. Babbage's project had drawbacks: one was the fact that his computer had to be mechanical, and the other was the precarious engineering of the time. Despite these problems, Charles Babbage built a device that impressed the English government (Dawson, 2017; Berg, 2016).

However, the history of computing began much earlier. As we know, the computer is a machine capable of performing calculations with one group of numbers and yet adaptable to perform new calculations with another group of numbers. The first “model” was the abacus, used since 2000 BC, a type of computer on which the sums can be clearly seen on its wires. In 1943, during World War II, the ENIAC was developed; it weighed 30 tons, was 5.5 meters high and 25 meters long, and contained 70,000 resistors and 17,468 valves (Haigh, Priestley, & Rope, 2016).

The ENIAC was a great calculating machine and based its structure on scientific advances already developed, such as Charles Babbage's sophisticated mathematical calculating machines and the mechanical calculators of Blaise Pascal, Leibniz, and Charles Xavier Thomas. ENIAC was the inspiration for many other computers that followed, such as the EDVAC (Electronic Discrete Variable Automatic Computer), the ORDVAC (Ordnance Discrete Variable Automatic Computer), the SEAC (Standards Eastern Automatic Computer), and the UNIVAC, the latter also built by Eckert and Mauchly for the processing of census data of the American population (Haigh, Priestley, & Rope, 2016; Meysenburg, 2018).

By 1955, a computer weighed only 3 tons, consumed 50 kW of power, and cost $200,000. Such a machine could do 50 multiplications per second. Thus, the first computers were machines within the reach only of large companies or institutions that had very demanding calculation needs and the economic conditions for such a large investment. In the mid-1970s, computers began to be increasingly affordable. In 1981, IBM launched the PC (Personal Computer) on the market. The PC was distinguished from the machines existing until then by being aimed at individual users, who could have on their desk a machine for exclusive use, a concept that had not existed before. Computers had been centralized mainframes, on which users had only a monitor and a keyboard, with all processing performed on the server (Wadhwani et al., 2015; Pyburn, 1986).

Around the same time, the internet was created in the United States. Called Arpanet, its function was to link research laboratories. In that period, a professor at the University of California sent a friend at Stanford the first email in history. The network belonged to the US Department of Defense, and the world was at the height of the Cold War. Arpanet was a guarantee that communication between the military and scientists would persist even in the event of bombing: it consisted of points that kept working regardless of whether other points had problems (Hauben, 2007).

From 1982 onward, the use of Arpanet grew in the academic field. Initially, use was restricted to the US, but it expanded to other countries, such as the Netherlands, Denmark, and Sweden, and since then the name internet has come into use. In 1987, the network was first released for commercial use in the US. In 1992, several Internet service providers began to emerge, and in the same year the European Particle Physics Laboratory (CERN) invented the World Wide Web, which began to be used to make information available to any Internet user. Since then, the network has become widespread (Hauben, 2007; Walden, 2019).

We have dealt with data since the dawn of humanity, as can be seen in historical records, and even more so in the present, when computational advances allow us to store, organize, work with, and analyze data far more easily and far more often. Thus, in modern times, cars, refrigerators, and wearable devices connected to each other are already a reality, generating even more data to be processed and transformed into useful information. Looking at modern times, we clearly see a big change compared to previous decades, driven above all by the internet: think of the amount of data generated daily on social networks alone, note the huge number of websites on the web, and consider the ability to shop online even from a mobile phone, when, in the not too distant past, the most computerization that stores had were isolated systems to manage their physical establishments (Longo & Drazen, 2016).
