Chaos-Enhanced Firefly Algorithm with Automatic Parameter Tuning


Xin-She Yang (National Physical Lab, UK)
Copyright: © 2011 | Pages: 11
DOI: 10.4018/jsir.2011100101


Many metaheuristic algorithms are nature-inspired, and most are population-based. Particle swarm optimization (PSO) is a good example of an efficient metaheuristic algorithm, and inspired by PSO, many new algorithms have been developed in recent years. The firefly algorithm, for example, was inspired by the flashing behaviour of fireflies. In this paper, the author extends the standard firefly algorithm to introduce a chaos-enhanced firefly algorithm with automatic parameter tuning, which results in two further variants of FA. The author first compares the performance of these algorithms, and then uses them to solve a benchmark design problem in engineering. The results are compared and analyzed against those obtained by other methods.
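The standard firefly algorithm referred to in the abstract can be sketched as follows. This is an illustrative implementation of the commonly published FA update rule, not the paper's own code; the parameter names `alpha` (randomization strength), `beta0` (attractiveness at zero distance), and `gamma` (light absorption coefficient) follow the usual FA conventions.

```python
import numpy as np

def firefly_step(x, brightness, alpha=0.2, beta0=1.0, gamma=1.0, rng=None):
    """One iteration of the standard firefly algorithm (sketch).

    x: (n, d) array of firefly positions.
    brightness: (n,) objective values; here lower is better, so a
    firefly moves toward every firefly with a smaller objective value.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = x.shape
    new_x = x.copy()
    for i in range(n):
        for j in range(n):
            if brightness[j] < brightness[i]:  # firefly j is "brighter"
                r2 = np.sum((x[i] - x[j]) ** 2)
                # Attractiveness decays exponentially with squared distance.
                beta = beta0 * np.exp(-gamma * r2)
                # Move toward j, plus a small random perturbation.
                new_x[i] += beta * (x[j] - new_x[i]) + alpha * (rng.random(d) - 0.5)
    return new_x
```

Note that the brightest firefly has no brighter neighbour and therefore does not move, so the best objective value found never worsens between iterations; in the paper's chaos-enhanced variants, parameters such as `alpha` and `gamma` would be tuned automatically rather than held fixed.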

1. Introduction

The search for optimality in many optimization applications is a challenging task, and search efficiency is one of the most important measures of an optimization algorithm. In addition, an efficient algorithm does not necessarily guarantee that global optimality is reachable. In fact, many optimization algorithms are only efficient at finding local optima. For example, the classic hill-climbing or steepest descent method is very efficient for local optimization. Global optimization typically involves objective functions that can be multimodal and highly nonlinear, so it is often very challenging to find the global optimum, especially for large-scale optimization problems. Recent studies suggest that metaheuristic algorithms such as particle swarm optimization are promising for solving these tough optimization problems (Kennedy & Eberhart, 1995; Kennedy et al., 2001; Shi & Eberhart, 1998; Eberhart & Shi, 2000; Yang, 2008).
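The limitation of purely local methods mentioned above is easy to demonstrate. The sketch below runs plain steepest descent on an illustrative double-well function, f(x) = x⁴ - x² + 0.2x (an assumption chosen for this example, not from the paper): starting in the right-hand basin, it converges efficiently, but only to the shallower local minimum near x ≈ 0.65 rather than the global minimum near x ≈ -0.75.

```python
def grad_descent(f_grad, x0, lr=0.05, steps=200):
    """Plain steepest descent (sketch): fast locally, but it can only
    reach the minimum of whichever basin x0 starts in."""
    x = x0
    for _ in range(steps):
        x -= lr * f_grad(x)
    return x

# Illustrative double-well objective: f(x) = x**4 - x**2 + 0.2*x.
# Its global minimum is near x = -0.75; a shallower local minimum
# sits near x = 0.65.
grad = lambda x: 4 * x**3 - 2 * x + 0.2
x_found = grad_descent(grad, x0=1.0)  # starts in the right-hand basin
```

Starting from `x0 = 1.0`, `x_found` settles near 0.65, the inferior local minimum; this is exactly the trap that population-based, randomized metaheuristics aim to escape.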

Most metaheuristic algorithms are nature-inspired, from simulated annealing (Kirkpatrick et al., 1983) to the firefly algorithm (Yang, 2008, 2010a), and from particle swarm optimization (Kennedy & Eberhart, 1995; Kennedy et al., 2001) to cuckoo search (Yang & Deb, 2010). These algorithms have been applied to almost all areas of optimization, design, scheduling and planning, data mining, machine intelligence, and many others (Gandomi et al., in press; Talbi, 2009; Yang, 2010a). On the other hand, chaotic tunneling is an important phenomenon in complex systems (Tomsovic, 1994; Podolskiy & Narmanov, 2003; Kohler et al., 1998; Delande & Zakrzewski, 2003; Shudo & Ikeda, 1998; Shudo et al., 2009). Traditional wisdom in optimization is to avoid numerical instability and chaos, yet contemporary studies suggest that chaos can assist some algorithms such as genetic algorithms (Yang & Chen, 2002). For example, metaheuristic algorithms often use randomization techniques to increase the diversity of the solutions generated during search iterations (Talbi, 2009; Yang, 2010a). The most common randomization techniques are probably local random walks and Lévy flights (Gutowski, 2001; Pavlyukevich, 2007; Yang, 2010b).
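The Lévy flights mentioned above replace small Gaussian perturbations with heavy-tailed step lengths, so most moves stay local while occasional long jumps carry the search into distant regions. A common way to draw such steps is Mantegna's algorithm, sketched below; the stability index `beta` in (1, 2] is a standard symbol for this scheme, not a parameter taken from this paper.

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    """Draw one Lévy-flight step length via Mantegna's algorithm (sketch).

    beta in (1, 2] controls how heavy the tail is: smaller beta gives
    more frequent long jumps.
    """
    # Scale of the Gaussian numerator, chosen so u/|v|^(1/beta)
    # approximates a Levy-stable distribution of index beta.
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
             ) ** (1 / beta)
    u = rng.gauss(0.0, sigma)
    v = rng.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)
```

In a population-based search, one would scale such a step and add it to a candidate solution in place of the usual uniform or Gaussian noise; the rare large values of `levy_step` are what let the walk escape a local basin.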

The key challenge in global optimization is that nonlinearity leads to multimodality, which in turn causes problems for almost all optimization algorithms: the search process may be trapped in a local valley, making progress towards the global optimum extremely difficult. Even with well-established stochastic search algorithms such as simulated annealing (Kirkpatrick et al., 1983), care must be taken to ensure the search can escape local optima. Premature convergence may occur in many algorithms, including simulated annealing and genetic algorithms. The key ability of an efficient global search algorithm is to escape local optima, to visit all modes, and then to converge to the global optimum.
