Large-Scale LP in Business Analytics

William Chung (City University of Hong Kong, Hong Kong)
Copyright: © 2014 | Pages: 10
DOI: 10.4018/978-1-4666-5202-6.ch126

Chapter Preview

Introduction

Although the definition of business analytics (BA) continues to evolve, BA generally involves big data and statistical and quantitative analyses to describe and investigate business performance. The results of these analyses are expected to provide new insights for future business planning by establishing predictive models, which are themselves based on big data. Future business planning includes numerous operations and supply chain problems, such as demand forecasting, capacity planning, workforce planning, revenue management, inventory management, logistics analysis, and routing, together with prescriptive goals when the problem is one of optimization. Manyika, Chui, Brown, Bughin, Dobbs, Roxburgh, and Byers (2011) identified several merchandising problems, such as assortment optimization and pricing optimization.

With the availability of big data, these long-standing problems become new areas of research that may require the development of new and sophisticated BA approaches and methods. This is because big data generally refers to data sets so large and complex that they create significant challenges for traditional data management and analysis tools. For example, Weier (2007) reported that Wal-Mart handles 800 million transactions generated by its 30 million customers each day, and another chain store uses its data systems to handle a database table of more than four billion rows. Kiron and Shockley (2011) noted that the next big data measure after the zettabyte (10²¹ bytes) is the yottabyte (10²⁴ bytes); billions of years would be required to download a yottabyte file at current high-speed broadband speeds, yet if Internet traffic continues to grow at current rates, we will likely approach the yottabyte milestone before the end of this century. IBM Big Data (2013) stated that 2.5 quintillion bytes of data are generated every day and that 90% of the data in the world today has been generated in the last two years alone.
Hence, the ability to store and retrieve big data is only a small piece of the competitive puzzle. Through BA techniques such as data mining and statistical prediction models, successful retail Web sites can suggest additional purchases at checkout. Clearly, the sheer size of a company's data is not the sole factor that distinguishes the winners. Another key factor is a company's ability to apply optimization methods to big data in order to optimize resource utilization and support decision making; one such method is linear programming (LP). In this chapter, we discuss the application of large-scale LP from the BA perspective.

Key Terms in this Chapter

Cloud Computing: Cloud computing generally refers to the use of computing resources (including hardware and software) that are delivered as services (such as servers, storage, and applications) over the Internet to an organization’s computers and devices.

Large-Scale Linear Programming: Large-scale linear programming involves a very large number of decision variables (columns of the constraint matrix) and constraints (rows of the constraint matrix), which create significant challenges, such as long solution times and memory limitations, in obtaining solutions.
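To make the scale issue concrete, the following minimal sketch solves a tiny LP with SciPy's linprog. The objective and constraints are invented for illustration; a genuinely large-scale LP would have millions of such columns and rows rather than two variables and three constraints.

```python
import numpy as np
from scipy.optimize import linprog

# Maximize 3x + 5y, expressed as minimizing -(3x + 5y).
c = np.array([-3.0, -5.0])
# Constraint rows of A_ub:  x + 2y <= 14,  -3x + y <= 0,  x - y <= 2
A_ub = np.array([[1.0, 2.0],
                 [-3.0, 1.0],
                 [1.0, -1.0]])
b_ub = np.array([14.0, 0.0, 2.0])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)  # optimal (x, y) = (6.0, 4.0)
```

At large scale, the same constraint-matrix data would be stored sparsely and passed to a solver that exploits its structure, which is where methods such as decomposition become relevant.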

Conditional Value at Risk: Conditional value at risk is a risk measure derived by taking a weighted average of the value at risk and the losses exceeding the value at risk. Value at risk is widely used in financial risk management to measure the potential loss in value of a risky asset or portfolio over a defined period for a given confidence interval. Conditional value at risk is also called average value at risk and expected tail loss.
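A quick empirical sketch of the relationship between the two measures, using simulated losses; the normal distribution, sample size, and 95% confidence level are chosen only for illustration.

```python
import numpy as np

def var_cvar(losses, alpha=0.95):
    # VaR: the alpha-quantile of the loss distribution.
    # CVaR: the average of losses at or beyond the VaR (expected tail loss).
    losses = np.asarray(losses, dtype=float)
    var = np.quantile(losses, alpha)
    cvar = losses[losses >= var].mean()
    return var, cvar

rng = np.random.default_rng(0)           # illustrative simulated loss sample
losses = rng.normal(0.0, 1.0, 100_000)
var, cvar = var_cvar(losses, 0.95)
# By construction CVaR >= VaR, since it averages only the tail losses.
```

For a standard normal loss at the 95% level, the empirical VaR lands near 1.64 and the CVaR near 2.06, matching the known theoretical values.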

Business Analytics: Business analytics utilizes big data and statistical and quantitative analyses to describe and investigate business performance and optimum planning.

Big Data: Big data, obtained automatically by advanced Web technology, digital sensors, and so forth, refers to data so large and complex that they create significant challenges for data management and analysis tools.

Dantzig–Wolfe Decomposition: Dantzig–Wolfe decomposition is a solution method for large-scale linear programs whose constraint matrix has a special structure, such as a block-angular form.
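The block-angular structure mentioned above can be pictured directly: independent diagonal blocks, one per subproblem, coupled only by a few linking rows. A small sketch with invented block contents:

```python
import numpy as np
from scipy.linalg import block_diag

B1 = np.array([[1.0, 2.0],   # constraints touching only subproblem 1's variables
               [0.0, 1.0]])
B2 = np.array([[3.0, 1.0],   # constraints touching only subproblem 2's variables
               [1.0, 1.0]])
linking = np.array([[1.0, 1.0, 1.0, 1.0]])  # one coupling row over all variables

A = np.vstack([linking, block_diag(B1, B2)])
# Dantzig-Wolfe keeps the linking rows in a master problem and generates
# columns from the independent blocks, so each block can be solved separately.
```

The zero off-diagonal regions are what make the decomposition pay off: without the linking row, the LP would split into fully independent subproblems.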

Scenario-Based Linear Programming: Scenario-based linear programming is a type of stochastic linear programming in which the numbers of decision variables and constraints are determined by the number of samples or scenarios created. Normally, the samples or scenarios are retrieved from big data.
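As a hedged illustration, the sketch below builds a tiny scenario-based LP: a single ordering decision with a shortage penalty averaged over demand scenarios. The demands, costs, and newsvendor-style formulation are invented for illustration; in practice the scenarios would be drawn from big data. One shortage variable and one constraint are created per scenario, which is exactly how large scenario sets turn a small model into a large-scale LP.

```python
import numpy as np
from scipy.optimize import linprog

d = np.array([80.0, 100.0, 120.0, 150.0])  # demand scenarios (illustrative)
K = len(d)
c_order, c_short = 1.0, 3.0                # unit ordering cost, shortage penalty

# Variables: x = [q, s_1, ..., s_K]; minimize c_order*q + (c_short/K) * sum s_k
obj = np.concatenate([[c_order], np.full(K, c_short / K)])
# Each scenario adds one row: s_k >= d_k - q, rewritten as -q - s_k <= -d_k
A_ub = np.hstack([-np.ones((K, 1)), -np.eye(K)])
res = linprog(obj, A_ub=A_ub, b_ub=-d, bounds=[(0, None)] * (K + 1))
print(res.x[0])  # expected-cost-minimizing order quantity (120.0 here)
```

With thousands of sampled scenarios instead of four, the same construction yields a constraint matrix with thousands of rows and columns, motivating the large-scale solution methods discussed in this chapter.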
