Particle Swarm Optimization (PSO) is popular for solving complex optimization problems. However, it is easily trapped in local minima. The authors modify the traditional PSO algorithm by adding an extra step, yielding an algorithm called PSO-Shock. The PSO-Shock algorithm initializes in the same way as the PSO algorithm. Entrapment in a local minimum is detected by counting stall generations. When the stall-generation count reaches a prespecified value, the particles are perturbed. This helps the particles escape the current local minimum and find better solutions. The behavior of the PSO-Shock algorithm is studied using a known benchmark: Schwefel's function. Given its promising performance on Schwefel's function, the PSO-Shock algorithm is then used to optimize the weights and biases of Artificial Neural Networks (ANNs). The trained ANNs then forecast electricity consumption in Thailand. The proposed algorithm reduces the forecasting error compared to the traditional training algorithms: the percentage reduction of error is 23.81% compared to the Backpropagation algorithm and 16.50% compared to the traditional PSO algorithm.
1. Introduction
Effective modern data-harvesting tools have shifted the challenge from the data scarcity of a few decades ago to the need for new or improved methods to analyze the resulting data. Artificial Neural Networks (ANNs) and ANN-based techniques remain powerful enough to work with such data, though modest modifications can further improve their performance. As a result, the architecture of ANNs has evolved over the years from the Feed Forward NN and will continue to change until a new method that can outperform ANNs is found.
Due to the highly nonlinear patterns appearing in electricity consumption data, ANNs are widely used to forecast electricity consumption. The superior performance of ANNs over other methods for forecasting future electricity consumption has been shown in many studies (Singh & Sahay, 2018; Chakravorty, Shah, & Nagraja, 2018). ANNs have both advantages and disadvantages (Hippert, Pedreira, & Souza, 2001). The highlighted limitations of ANNs are computational complexity, the amount of data and time required during the training phase to learn the patterns, performance that depends on the random initialization of weights and biases, and training that is likely to stop in local minima.
In this study, we address one of these issues, namely that ANNs can become trapped in local minima during the training phase, by introducing a new training algorithm. The conventional training algorithm for adjusting the weights and biases in ANNs is Backpropagation. However, researchers have already identified this issue with Backpropagation and suggested alternative methods to train ANNs (Jeenanunta & Abeyrathna, 2017; Shayeghi, Shayanfar & Azimi, 2009; Jeenanunta & Abeyrathna, 2019). They highlight the importance of metaheuristic approaches for training ANNs. Particle Swarm Optimization (PSO) and the Genetic Algorithm (GA) are popular among researchers due to their ability to solve complex nonlinear optimization problems (Subbaraj & Rajasekaran, 2019). PSO has been successfully applied to train ANNs and has been shown to outperform the traditional training algorithm, Backpropagation, on a number of occasions (Jeenanunta & Abeyrathna, 2017; Das, 2017; Asar, Hassnain, & Khan, 2007; Mishra & Patra, 2008). Likewise, GA has also produced better ANN weights than Backpropagation in electricity forecasting research (Mishra & Patra, 2008; Heng, Srinivasan & Liew, 1998).
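To make the idea of PSO-based ANN training concrete, the following is a minimal sketch in which each particle encodes the flattened weights and biases of a small one-hidden-layer network and the swarm minimizes training MSE. The network size, toy data, hyperparameter values, and all function names here are our own illustrative assumptions, not the implementations used in the cited studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (an illustrative stand-in for a load series).
X = rng.uniform(-1, 1, size=(64, 3))
y = np.sin(X.sum(axis=1))

N_IN, N_HID = 3, 5
DIM = N_IN * N_HID + N_HID + N_HID + 1  # all weights and biases, flattened


def mse(theta):
    """Decode a particle into network parameters and return training MSE."""
    w1 = theta[:N_IN * N_HID].reshape(N_IN, N_HID)
    b1 = theta[N_IN * N_HID:N_IN * N_HID + N_HID]
    w2 = theta[-(N_HID + 1):-1]
    b2 = theta[-1]
    hidden = np.tanh(X @ w1 + b1)
    pred = hidden @ w2 + b2
    return np.mean((pred - y) ** 2)


def pso_train(n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Standard global-best PSO over the flattened parameter vector."""
    pos = rng.uniform(-1, 1, size=(n_particles, DIM))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([mse(p) for p in pos])
    g = pbest[pbest_f.argmin()].copy()
    g_f = pbest_f.min()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        f = np.array([mse(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        if f.min() < g_f:
            g, g_f = pos[f.argmin()].copy(), f.min()
    return g, g_f
```

The fitness function is the only ANN-specific part; swapping in a different error measure or network size changes nothing in the PSO loop itself.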
Nevertheless, these metaheuristics also struggle when the data pattern becomes more complex and the ANN has a large number of weight parameters to optimize. An advantage of metaheuristic approaches, however, is that they can be adjusted to increase performance. The PSO algorithm is one example: its performance can be further improved by changing it slightly or adding extra steps. In our previous attempt (Abeyrathna & Jeenanunta, 2019), we introduced the PSO+GA algorithm, in which we modified the PSO algorithm by introducing GA operations into the traditional PSO workflow. In this study, by contrast, we make even simpler modifications to the traditional PSO algorithm in order to obtain on-par or better performance while training ANNs. The new modifications are not only intelligible but also computationally advantageous. The proposed algorithm is well suited to the considered application, where it is used to train ANNs on complex data such as historical electricity consumption in Thailand.
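The extra step described in the abstract, counting stall generations and perturbing the particles once a prespecified count is reached, can be sketched as follows on Schwefel's function, the benchmark used in this study. The stall threshold, perturbation scale, bounds, and function names are illustrative assumptions of this sketch, not the exact settings of the proposed PSO-Shock algorithm.

```python
import math
import random


def schwefel(x):
    """Schwefel's function; global minimum near 0 at x_i = 420.9687."""
    d = len(x)
    return 418.9829 * d - sum(xi * math.sin(math.sqrt(abs(xi))) for xi in x)


def pso_shock(f, dim=2, n=30, iters=500, stall_limit=20, shock_scale=100.0,
              bounds=(-500.0, 500.0), seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    gi = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[gi][:], pbest_f[gi]
    stall = 0  # generations without improvement of the global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
        best_i = min(range(n), key=lambda i: pbest_f[i])
        if pbest_f[best_i] < gbest_f - 1e-12:
            gbest, gbest_f = pbest[best_i][:], pbest_f[best_i]
            stall = 0
        else:
            stall += 1
        if stall >= stall_limit:
            # "Shock" step: perturb every particle away from the stalled region.
            for i in range(n):
                pos[i] = [min(hi, max(lo, x + rng.gauss(0.0, shock_scale)))
                          for x in pos[i]]
            stall = 0
    return gbest, gbest_f
```

Because the global best is retained across shocks, a perturbation can only help: if the shocked swarm finds nothing better, the previous best solution still stands.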