Introduction to Fireworks Algorithm

Ying Tan, Chao Yu, Shaoqiu Zheng, Ke Ding
Copyright: © 2013 | Pages: 32
DOI: 10.4018/ijsir.2013100103

Abstract

Inspired by the explosion of fireworks at night, the conventional fireworks algorithm (FWA) was developed in 2010. Since then, several improvements and applications have been proposed to increase the efficiency of FWA. In this paper, the conventional fireworks algorithm is first summarized, and three improved fireworks algorithms are presented. By changing the way the numbers and amplitudes of sparks in a firework's explosion are calculated, the improved FWAs become more reasonable and easier to explain. In addition, the multi-objective fireworks algorithm and the graphics processing unit (GPU) based fireworks algorithm are also presented; in particular, the GPU-based fireworks algorithm speeds up the optimization process considerably. Extensive experiments on 13 benchmark functions demonstrate that the three improved fireworks algorithms significantly increase the accuracy of the found solutions while dramatically decreasing the running time. Finally, some applications of the fireworks algorithm are briefly described, and its shortcomings and future research directions are identified.

1. Introduction

In most engineering fields, many problems can be reduced to numerical optimization problems through mathematical modeling. For some of these problems, not only the global optimum but also multiple feasible solutions and local optima need to be identified in order to provide enough information for decision makers. Such problems are generally referred to as multi-modal and multi-objective optimization problems. Solving them requires finding the maximum or minimum values of the objective functions within a limited time.

Traditional methods generally solve continuous and differentiable functions using mathematical techniques based on gradient information. However, when dealing with multi-modal and multi-objective optimization problems, such methods cannot always obtain reasonable solutions. To solve function optimization problems efficiently, many algorithms inspired by biological behavior have been proposed in recent years.

The study of biological phenomena is no longer confined to the biology discipline alone, but has expanded into mathematics, computer science, information science and other research fields. Inspired by the behavior of groups of animals, many swarm intelligence algorithms have been designed in the field of computer science.

A swarm can be described as a number of individuals in adjacent areas that interact with each other. In nature, a bee, an ant or a bird can hardly survive without its kin. A group of organisms, such as the aforementioned bees, ants or birds, therefore has a better chance of survival than a lone individual. The survival chance of a group is not a simple sum of each individual's chances, but a more complex outcome of social and group dynamics. The characteristics of an animal group can greatly help its members adapt to their environment. Each individual obtains information through social interaction, and the information gained by an individual in a group exceeds what any single individual can obtain alone. Information is transferred within the group, and each individual processes this information and changes its own behavior, including its behavioral patterns and norms. The whole group therefore acquires capabilities and characteristics, especially the ability to adapt to the environment, that a single individual can hardly gain when working alone. The ability of an individual to adapt to its environment is known as intelligence, and this intelligence is gained through the clustering of individuals.

Inspired by nature, many swarm intelligence algorithms have been proposed. By observing the way ants find food, Colorni and his colleagues proposed the ant colony optimization (ACO) algorithm in 1991 (Colorni, Dorigo, & Maniezzo, 1991). Particle swarm optimization (PSO) was put forward by Kennedy and Eberhart (1995); the algorithm mimics the pattern of birds flocking to find food. The differential evolution (DE) algorithm is another swarm intelligence algorithm, introduced by Storn and Price (1995), in which the differences between individuals are fully utilized. The artificial bee colony (ABC) algorithm and the fish school search (FSS) algorithm were proposed in 2008 and 2009, respectively (Karaboga & Basturk, 2008; Bastos Filho, de Lima Neto, Lins, Nascimento, & Lima, 2009). The most recently proposed fireworks algorithm (FWA) is a swarm intelligence algorithm published by Tan and Zhu (2010). This algorithm is inspired by fireworks exploding at night and is quite effective at finding the global optimum. When a firework explodes, a shower of sparks appears in the adjacent area. Those sparks explode again and generate further showers of sparks in smaller areas. In this way, the sparks gradually search the whole solution space at a fine granularity and concentrate on a small region to locate the optimal solution.
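
To make this explosion mechanism concrete, the following Python sketch illustrates the core of the explosion operator of the conventional FWA as described by Tan and Zhu (2010): fireworks with better fitness produce more sparks with smaller amplitudes, and vice versa. The parameter values (spark budget m, bounds a and b, maximum amplitude A_hat, and the search range) are illustrative assumptions rather than the settings used in the reported experiments, and the Gaussian mutation sparks of the full algorithm are omitted.

import numpy as np

def explode(fireworks, f, m=50, a=0.04, b=0.8, A_hat=40.0, eps=1e-12,
            lower=-100.0, upper=100.0):
    """Generate explosion sparks for the current fireworks (minimization).

    fireworks : (n, d) array of firework positions
    f         : objective function to be minimized
    m, a, b   : total spark budget and per-firework lower/upper spark bounds
    A_hat     : maximum explosion amplitude
    """
    fitness = np.array([f(x) for x in fireworks])
    y_max, y_min = fitness.max(), fitness.min()

    # Better (smaller) fitness -> more sparks; worse fitness -> larger amplitude.
    s = m * (y_max - fitness + eps) / (np.sum(y_max - fitness) + eps)
    s = np.clip(np.round(s), round(a * m), round(b * m)).astype(int)
    A = A_hat * (fitness - y_min + eps) / (np.sum(fitness - y_min) + eps)

    d = fireworks.shape[1]
    sparks = []
    for x, s_i, A_i in zip(fireworks, s, A):
        for _ in range(s_i):
            spark = x.copy()
            # Displace a randomly chosen subset of dimensions by the same offset.
            z = np.random.randint(1, d + 1)
            dims = np.random.choice(d, z, replace=False)
            spark[dims] += A_i * np.random.uniform(-1.0, 1.0)
            # Map out-of-range coordinates back into the search range.
            out = (spark < lower) | (spark > upper)
            spark[out] = lower + np.abs(spark[out]) % (upper - lower)
            sparks.append(spark)
    return np.array(sparks)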

As a practical optimization algorithm, the fireworks algorithm fulfills three user requirements (Storn & Price, 1997). First, FWA can handle linear, non-linear and multi-modal test functions. Second, FWA can be parallelized in order to deal with complex practical problems. Third, FWA has good convergence properties and can consistently find the global minimum.
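
For illustration, the explosion operator sketched above can be wrapped in a simple FWA-style main loop: initialize a small population of fireworks, generate sparks, and select the next generation from all candidates. The selection rule below keeps the best candidate and fills the remaining slots at random, which is a simplification of the distance-based selection used in the original paper; the function and parameter names are illustrative, not from the source.

def fwa_minimize(f, d=10, n=5, iters=200, lower=-100.0, upper=100.0):
    """A simplified FWA-style loop built on the explode() sketch above."""
    fireworks = np.random.uniform(lower, upper, size=(n, d))
    best_x, best_y = None, np.inf
    for _ in range(iters):
        sparks = explode(fireworks, f, lower=lower, upper=upper)
        candidates = np.vstack([fireworks, sparks])
        values = np.array([f(x) for x in candidates])
        i_best = int(values.argmin())
        if values[i_best] < best_y:  # track the best location found so far
            best_x, best_y = candidates[i_best].copy(), values[i_best]
        # Simplified selection: keep the best candidate and fill the remaining
        # slots at random (the original FWA uses distance-based probabilities).
        others = np.delete(np.arange(len(candidates)), i_best)
        rest = np.random.choice(others, n - 1, replace=False)
        fireworks = np.vstack([candidates[i_best], candidates[rest]])
    return best_x, best_y

# Example: minimize the sphere function in 10 dimensions.
x_best, y_best = fwa_minimize(lambda x: float(np.sum(x ** 2)))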
