Meta Heuristic Approach for Automatic Forecasting Model Selection


Shoban Babu (Capability Network, Accenture Management Consulting, Bangalore, India) and Mitul Shah (Accenture Management Consulting, Gurgaon, India)
DOI: 10.4018/jisscm.2013040101


Selecting an appropriate forecasting model, with optimized parameters, for a given business scenario is a challenging task that requires considerable expert knowledge and experience. The problem becomes computationally complex when the business needs forecasts for thousands of time series in a given time period. Business users are often interested in adapting the best parameter settings of forecasting models that have proven themselves in the past and reusing them for further predictions. This approach lets users identify the forecasting model and parameter value that minimize the average forecast error across all the time series. This paper proposes a genetic algorithm (GA) based approach that simultaneously suggests a suitable forecasting model and the parameter value(s) that minimize the average mean absolute percentage error over all the time series. The approach is tested on randomly generated data sets, and for a fair comparison a few randomly selected samples are also run in SAS 9.1, with the results compared against those obtained using the GA-suggested forecasting model and parameter values.
Article Preview


Demand forecasts are essential for efficient supply chain planning. Sharing demand forecast information among supply chain members may improve supply chain performance (Yan & Ghose, 2008; Yan & Wang, 2009; Posey & Bari, 2009), and more accurate forecasts improve it further. Practitioners in various domains use many forecasting models, ranging from simple averaging models to complex smoothing and regression models (Patino et al., 2010; Kuruvilla & Alexander, 2010). The performance of these models depends on careful tuning of their associated parameters. In general, selecting an appropriate model and tuning its parameters for a specific forecasting scenario requires considerable expert knowledge and experience. Moreover, the task becomes very complex and computationally tedious when many time series must be forecast at a given point in time.

Four factors influence model selection (Makridakis et al., 2003): the amount of data available, the characteristics of the data, the type (interval between successive values) of the data, and the number and frequency of forecasts required. More predictions are needed when forecasting on a daily rather than a monthly or quarterly basis. Thus, the forecasting models considered can include statistically sophisticated ones, which require a great deal of data to build and human inputs to operate.

Automatic forecasting is usually defined as forecasting without the aid of an analyst skilled in time series analysis, or forecasting when the number of series is too large for an analyst to investigate each one. It is usually performed on each time series independently. For each time series and for each candidate model, the parameter estimates (weights) should be optimized for best results, which means several optimizations may be required per series. Estimating an appropriate parameter setting for a given forecasting scenario is therefore computationally tedious. Basically, the Parameter Setting Problem (PSP) optimizes the parameters of a known optimization problem. In practice, however, we tend to adapt those parameters to solve unknown, out-of-sample problems; this is called the Parameter Adaptation Problem (PAP).
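As an illustration of the PSP, the following sketch tunes the smoothing parameter of single exponential smoothing by grid search, choosing the alpha that minimizes in-sample MAPE on a toy series. The function names and data are illustrative assumptions, not taken from the paper.

```python
def ses_forecasts(series, alpha):
    """One-step-ahead SES forecasts; forecasts[i] predicts series[i + 1]."""
    level = series[0]
    forecasts = []
    for y in series[1:]:
        forecasts.append(level)                  # forecast made before observing y
        level = alpha * y + (1 - alpha) * level  # update the smoothed level
    return forecasts

def mape(actuals, forecasts):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actuals, forecasts)) / len(actuals)

def best_alpha(series, grid):
    """PSP for one series: pick the alpha on the grid with the lowest in-sample MAPE."""
    return min(grid, key=lambda a: mape(series[1:], ses_forecasts(series, a)))

demand = [100, 104, 99, 110, 108, 115, 117, 120]   # toy demand history
grid = [i / 100 for i in range(5, 100, 5)]         # alpha candidates 0.05 .. 0.95
alpha = best_alpha(demand, grid)
```

Under the PAP, the `alpha` found on sample series like this one would then be reused on out-of-sample series rather than re-optimized.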

The PSP and PAP consider only one forecasting model at a time. In practice, practitioners use a number of forecasting models and apply the most appropriate one, with its optimized parameter values, for a given period. The Multiple Forecasting Models Parameter Adaptation Problem (MFMPAP) therefore involves multiple forecasting models. The basic assumption in parameter adaptation is that if a parameter setting works well for the sample problems, it should also deliver good results for the out-of-sample problems, provided the sample and out-of-sample problems are drawn from the same class of problems.
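The MFMPAP objective can be sketched as follows: each candidate pairs a model with fixed parameter values, and its fitness is the MAPE averaged over every series in the sample. The candidate names, models, and data here are hypothetical, chosen only to illustrate the objective.

```python
def naive_forecasts(series):
    """Forecast each value with the previous observation."""
    return series[:-1]

def ses_forecasts(series, alpha):
    """One-step-ahead single exponential smoothing forecasts."""
    level = series[0]
    out = []
    for y in series[1:]:
        out.append(level)
        level = alpha * y + (1 - alpha) * level
    return out

def mape(actuals, forecasts):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actuals, forecasts)) / len(actuals)

def avg_mape(model, all_series):
    """MFMPAP fitness: MAPE of one (model, parameters) candidate, averaged over all series."""
    return sum(mape(s[1:], model(s)) for s in all_series) / len(all_series)

series_pool = [[100, 102, 101, 105], [50, 55, 53, 60], [10, 9, 11, 12]]
candidates = [
    ("SES a=0.2", lambda s: ses_forecasts(s, 0.2)),
    ("SES a=0.8", lambda s: ses_forecasts(s, 0.8)),
    ("naive", naive_forecasts),
]
best = min(candidates, key=lambda c: avg_mape(c[1], series_pool))
```

The GA described below searches this same objective, but over a much larger space of model/parameter combinations than an exhaustive list could cover.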

In order to solve the MFMPAP, a Genetic Algorithm (GA) is designed and used in this study. Genetic Algorithms (Holland, 1975) are search algorithms based on the mechanics of natural selection and natural genetics (Goldberg, 2001). They combine survival of the fittest among string structures with a structured yet randomized information exchange, forming a search algorithm with some of the innovative flair of human search. GAs have been applied successfully in problem scenarios such as vehicle routing (Machado et al., 2002), job shop scheduling (Ono et al., 1996), circuit layout (Mazumder & Rudnick, 1998), and supply chain design (Dhanalakshmi et al., 2009).
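A minimal GA of this kind is sketched below, assuming real-valued chromosomes in [0, 1], tournament selection, arithmetic crossover, and Gaussian mutation; the paper's actual encoding and operators may well differ. A toy quadratic stands in for the average-MAPE fitness.

```python
import random

def run_ga(fitness, pop_size=20, generations=40, mutation_rate=0.2):
    """Minimize fitness(x) for a real-valued chromosome x in [0, 1]."""
    pop = [random.random() for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            # Binary tournament: the fitter of two random individuals survives
            a, b = random.sample(pop, 2)
            return a if fitness(a) < fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            w = random.random()
            child = w * p1 + (1 - w) * p2          # arithmetic crossover
            if random.random() < mutation_rate:
                child += random.gauss(0, 0.1)      # Gaussian mutation
            children.append(min(max(child, 0.0), 1.0))  # clamp to the valid range
        pop = children
    return min(pop, key=fitness)

random.seed(7)                                     # fixed seed for a reproducible demo
best = run_ga(lambda x: (x - 0.3) ** 2)            # toy objective with optimum at 0.3
```

In the MFMPAP setting, the chromosome would instead encode a model index together with that model's smoothing parameters, and the fitness would be the average MAPE across all sample series.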

In this paper, a GA is designed to solve the Multiple Forecasting Models Parameter Adaptation Problem (MFMPAP): it optimizes the model parameters and selects the appropriate forecasting model for a given forecasting scenario. The algorithm is designed so that it can consider any number of forecasting models. For simplicity, and to reduce experiment time, three models are considered here: single exponential smoothing (SES), double exponential smoothing (DES), and Winters' triple exponential smoothing (WTES). These models and their associated parameters are described in the following sections.
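For reference, the three models can be sketched with their standard textbook recursions. Initialization choices vary between implementations, and the paper's own variants may differ; this is an illustrative sketch only (DES here is Holt's method, and WTES uses additive seasonality).

```python
def ses(series, alpha):
    """Single exponential smoothing: smoothed level only."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level                                   # one-step-ahead forecast

def des(series, alpha, beta):
    """Holt's double exponential smoothing: level + trend (needs >= 2 points)."""
    level, trend = series[0], series[1] - series[0]
    for y in series[1:]:
        prev = level
        level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
    return level + trend                           # one-step-ahead forecast

def wtes(series, alpha, beta, gamma, period):
    """Winters' triple exponential smoothing, additive seasonality
    (needs >= 2 full seasons of data)."""
    mean1 = sum(series[:period]) / period
    mean2 = sum(series[period:2 * period]) / period
    level, trend = mean1, (mean2 - mean1) / period
    season = [series[i] - mean1 for i in range(period)]
    for i, y in enumerate(series[period:], start=period):
        prev = level
        level = alpha * (y - season[i % period]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
        season[i % period] = gamma * (y - level) + (1 - gamma) * season[i % period]
    return level + trend + season[len(series) % period]   # one-step-ahead forecast
```

SES has one parameter (alpha), DES two (alpha, beta), and WTES three (alpha, beta, gamma) plus the season length, which is the parameter space the GA searches over.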
