Big Data Applications in Business

Copyright: © 2019 | Pages: 30
DOI: 10.4018/978-1-5225-7609-9.ch003


Nothing seems able to stop the big data revolution. At once a promise of a better world and a source of Big Brother anxieties, big data is the new reality of the digital economy: it is the new territory of development and value creation for companies. The opportunities seem endless, which is why we must appropriate the data to better understand and tame it, in order to prepare for the future toward which it seems to lead us. After the theory, let us turn to the “fun” part, with some examples of big data uses that you may know without realizing it. This chapter presents examples of using big data to dynamically improve business strategy and generate value.
Chapter Preview


Someone is sitting in the shade today because someone planted a tree a long time ago.

Warren Buffett

With the advent of digital technology and smart devices, a large amount of varied data is being generated every day, and data volumes will continue to grow at a very real pace. This widespread production of data has ushered in the age of big data discussed in the previous chapter.

The data we produce, as well as other data we accumulate, constitutes a constant source of knowledge. Big data, then, is about collecting, analyzing, and using data efficiently and quickly with new tools to gain a competitive advantage by turning data into knowledge and generating value.

Beyond what big data is and how it may impact the business context, this third chapter highlights big data applications within businesses that have not been discussed in the previous chapters. Through this chapter, we recall the importance of big data in conducting business decision-making, and the role it plays as a complement in creating new opportunities for businesses.

In recent years, large companies have begun collecting and analyzing very large amounts of data. Today, they realize that big data can give them a competitive advantage. One advantage of this phenomenon is the ability to consult huge amounts of information very quickly. That matters because, regardless of industry, companies consistently need to know what customers really think in order to track trends.

If large volumes of data are used correctly, it becomes easier to access information such as purchasing behavior or consumer preferences. It is then possible to zoom in on data segments to study them more closely.

After the definitions of the different concepts in the first two chapters of this section, it is now time to turn to the application of big data in the business context. It should be noted, however, that in the big data universe the level of maturity in working with data varies greatly by business activity and by company. Web companies started very early, as did those competing with pure players. Companies typically pass through several phases: learning about the technologies and their integration, Proof of Concept (PoC) projects, tooling and deployment for selected business applications, and finally generalization across the organization.

Major players like Google, IBM, Cisco, and Microsoft have invested for several years in building datacenters and have also deployed solutions dedicated to data analysis. New entrants, however, are looking to take a slice of this coveted pie. In addition, many companies operating in more traditional sectors will be able to take advantage of the big data revolution.


Walmart: Big Data in Every Aisle

When it comes to data, and phenomenal amounts of it, it is hard not to think of Walmart, whose expertise in data management has allowed it to optimize its distribution processes and achieve cost leadership in its field of activity. In fact, Walmart is the symbol of a traditional business (as opposed to a virtual company like Google or Facebook) whose management is based on data analysis.

All of Walmart’s business decisions are based on extracting strategic information from data generated by the consumption habits of its customers and by the products in its inventory.

No company better illustrates the advantages of leveraging massive volumes of data for competitive advantage than Walmart, which operates a data warehouse with, at last count, 583 terabytes of sales and inventory data built on a massively parallel 1,000-processor system from data-warehouse-technology vendor Teradata, an NCR Corp. subsidiary. While some companies might consider having more than half a petabyte of data overkill, at Walmart it’s the way to do business (Babcock, 2006).

This allows Walmart to:

  • Store products or forecast the number of human resources required, based on an analysis of trends in the consumption habits of its customers;

  • Use just-in-time management methods to manage inventory procurement in partnership with suppliers, helping to keep storage costs to a minimum;

  • Know in real-time where a product is in the supply chain and how long it will take to find it on store shelves.
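The trend analysis behind the first point can be sketched in miniature. The following is a hypothetical toy example (the sales figures, window size, and safety-stock buffer are all invented for illustration, not taken from Walmart's actual systems): it forecasts next-period demand from historical sales with a simple moving average and derives a reorder quantity.

```python
# Toy sketch of trend-based demand forecasting (hypothetical data).
# A real retailer operates on terabytes; the principle is the same:
# turn historical consumption data into a forward-looking stock decision.

def moving_average_forecast(weekly_sales, window=3):
    """Forecast next-period demand as the mean of the last `window` periods."""
    recent = weekly_sales[-window:]
    return sum(recent) / len(recent)

# Invented weekly unit sales for one product, trending upward
sales = [120, 135, 150, 160, 180, 195]

forecast = moving_average_forecast(sales, window=3)
safety_stock = 0.2 * forecast          # 20% buffer against demand variability
reorder_quantity = round(forecast + safety_stock)

print(f"Forecast demand: {forecast:.1f} units")
print(f"Suggested reorder: {reorder_quantity} units")
```

Production systems replace the moving average with far richer models (seasonality, promotions, local events), but the shape of the pipeline, from past consumption to a stocking decision, is the same.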

Key Terms in this Chapter

Artificial Intelligence: The theory and development of computer systems able to perform tasks that traditionally have required human intelligence.

Cluster Analysis: A statistical technique whereby data or objects are classified into groups (clusters) that are similar to one another but different from data or objects in other clusters.
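To make the definition concrete, here is a minimal k-means sketch on invented one-dimensional data (customer spend figures chosen to form two obvious groups). Real cluster analysis would use a library such as scikit-learn; this toy version only shows the core loop: assign points to the nearest centroid, then recompute centroids.

```python
# Minimal k-means sketch illustrating cluster analysis (hypothetical 1-D data).

def kmeans_1d(points, centroids, iterations=10):
    for _ in range(iterations):
        # Assignment step: group each point with its nearest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Invented customer spend figures: two natural groups, around 10 and 100.
spend = [8, 9, 11, 12, 95, 100, 105]
centroids, clusters = kmeans_1d(spend, centroids=[0.0, 50.0])
print(centroids)  # two centroids, one per discovered group
```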

Data Science: It is a new discipline that combines elements of mathematics, statistics, computer science, and data visualization. The objective is to extract information from data sources. In this sense, data science is devoted to database exploration and analysis. This discipline has recently received much attention due to the growing interest in big data.

Amazon Web Services (AWS): A comprehensive, evolving cloud computing platform provided by Amazon. Such web services are sometimes called cloud services or remote computing services. The first AWS offerings were launched in 2006 to provide online services for websites and client-side applications.

Analytics: Has emerged as a catch-all term for a variety of different business intelligence (BI) and application-related initiatives. For some, it is the process of analyzing information from a particular domain, such as website analytics. For others, it is applying the breadth of BI capabilities to a specific content area (for example, sales, service, supply chain and so on). In particular, BI vendors use the “analytics” moniker to differentiate their products from the competition. Increasingly, “analytics” is used to describe statistical and mathematical data analysis that clusters, segments, scores and predicts what scenarios are most likely to happen.

Algorithm: A set of computational rules to be followed to solve a mathematical problem. More recently, the term has been adopted to refer to a process to be followed, often by a computer.
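A classic illustration of this definition is Euclid's method for the greatest common divisor: a fixed set of computational rules followed step by step, equally suited to a human with pencil and paper or to a computer.

```python
# Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
# until the remainder is zero; the last nonzero value is the GCD.
def gcd(a, b):
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # → 6
```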

Open Data: This term refers to the principle according to which public data (that gathered, maintained, and used by government bodies) should be made available to be accessed and reused by citizens and companies.

Big Data: A generic term that designates the massive volume of data that is generated by the increasing use of digital tools and information systems. The term big data is used when the amount of data that an organization has to manage reaches a critical volume that requires new technological approaches in terms of storage, processing, and usage. Volume, velocity, and variety are usually the three criteria used to qualify a database as “big data.”

Proof of Concept (PoC): A realization of a method or idea intended to demonstrate its feasibility, or a demonstration in principle aimed at verifying that a concept or theory has practical potential. A PoC represents the stage in product development at which it is established that the product will function as intended.

Machine Learning: A method of designing a sequence of actions to solve a problem that optimizes automatically through experience and with limited or no human intervention.
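As a toy illustration of "optimizing automatically through experience," the sketch below (with invented data points) fits a linear model y = w·x by gradient descent: the weight improves from the data alone, with no human intervention beyond setting a learning rate.

```python
# Toy machine learning sketch: learn y = w * x by gradient descent
# on invented data that roughly follows y = 2x.

data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 8.0)]

w = 0.0                  # initial guess
learning_rate = 0.01
for _ in range(1000):
    # Gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= learning_rate * grad

print(f"Learned weight: {w:.2f}")  # converges close to 2
```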

Data Analysis: This is a class of statistical methods that make it possible to process a very large volume of data and identify the most interesting aspects of its structure. Some methods help to extract relations between different sets of data, and thus, draw statistical information that makes it possible to describe the most important information contained in the data in the most succinct manner possible. Other techniques make it possible to group data in order to identify its common denominators clearly, and thereby understand them better.

Data Lake: A collection of storage instances of various data assets, in addition to the originating data sources. These assets are stored in a near-exact, or even exact, copy of the source format. The purpose of a data lake is to present an unrefined view of data to only the most highly skilled analysts, helping them explore their data refinement and analysis techniques independent of any of the system-of-record compromises that may exist in a traditional analytic data store (such as a data mart or data warehouse).
