A Hybrid Meta-Heuristic Method Based on Firefly Algorithm and Krill Herd

Gai-Ge Wang (Jiangsu Normal University, China), Amir H. Gandomi (Michigan State University, USA), Amir H. Alavi (Michigan State University, USA) and Yong-Quan Dong (Jiangsu Normal University, China)
DOI: 10.4018/978-1-4666-9479-8.ch019


This study proposes a new firefly-inspired krill herd (FKH) optimization method based on the integration of the firefly and krill herd algorithms. FKH introduces the attractiveness and light intensity updating (ALIU) operator, originally used in the firefly algorithm, into the krill herd method. This operator improves the local search and promotes population diversity, helping to avoid premature convergence. Moreover, an elitism strategy is adopted to retain the krill with the best fitness when the population is updated. The performance of the FKH method is verified on fifteen different benchmark functions. The results indicate that FKH is more accurate and effective than the basic krill herd method and several other optimization algorithms.
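The ALIU operator described above can be sketched as follows. This is a minimal illustration of the firefly attractiveness rule (attractiveness decaying with squared distance, plus a small random step), not the chapter's exact implementation; the function name and parameter values are assumptions.

```python
import numpy as np

def aliu_move(x_i, x_j, f_i, f_j, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """Illustrative attractiveness and light intensity updating (ALIU) step.

    Moves candidate x_i toward a brighter (better, for minimization)
    candidate x_j using the firefly attractiveness rule
    beta = beta0 * exp(-gamma * r^2), plus a small random perturbation.
    """
    rng = np.random.default_rng() if rng is None else rng
    if f_j >= f_i:                          # x_j is not brighter: no move
        return x_i
    r2 = np.sum((x_i - x_j) ** 2)           # squared Euclidean distance
    beta = beta0 * np.exp(-gamma * r2)      # attractiveness decays with distance
    step = alpha * (rng.random(x_i.shape) - 0.5)
    return x_i + beta * (x_j - x_i) + step
```

Applied within the krill herd update, such a move pulls each krill toward better-scoring neighbors while the random step maintains diversity.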
Chapter Preview

1. Introduction

In computer science, mathematics, and control theory, optimization is the search for a minimum or maximum within a given search space. To date, numerous techniques have been proposed for solving optimization tasks, and they can be categorized in many ways. A common approach is to classify them by their characteristics into two main groups: traditional methods and intelligent algorithms (Wang et al., 2014g). Traditional methods follow strict, deterministic steps: from the same initial starting point they follow the same path, and the final set of solutions is repeatable. Intelligent algorithms, in contrast, always involve some randomness, so the final solutions vary from run to run even with identical initial values. In most cases, however, both traditional methods and intelligent algorithms eventually converge to similar, if slightly different, values (Wang et al., 2014g). With recent developments in statistical theory and artificial intelligence, many new meta-heuristic methods have emerged for function optimization. Nature-inspired meta-heuristic methods perform effectively on complex numerical optimization problems. All meta-heuristic methods attempt to strike a trade-off between intensification/exploitation (local search) and diversification/exploration/randomization (global search) (Yang, 2010a).
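The deterministic-versus-stochastic distinction drawn above can be made concrete with two toy minimizers of f(x) = x². This is an illustrative sketch (function names and parameters are ours, not the chapter's): gradient descent is repeatable from a fixed start, while a random search differs between runs unless its seed is fixed, yet both converge near the same minimum.

```python
import random

def gradient_descent(x0, lr=0.1, steps=50):
    # Deterministic: the same x0 always yields the same trajectory and result.
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x            # gradient of f(x) = x^2 is 2x
    return x

def random_search(x0, steps=200, seed=None):
    # Stochastic: runs from the same x0 differ unless the seed is fixed.
    rng = random.Random(seed)
    best = x0
    for _ in range(steps):
        cand = best + rng.uniform(-0.5, 0.5)
        if cand * cand < best * best:   # keep the candidate only if it improves f
            best = cand
    return best
```

Both routines end close to the optimum x = 0, mirroring the observation that the two families of methods tend to converge to similar values by different routes.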

Inspired by nature, these meta-heuristic methods have been developed to solve complex engineering problems, such as dynamic WSN deployment (Wang et al., 2012a), optimization (Mirjalili et al., 2014c), multi-objective optimization (Li and Yin, 2013a; Mirjalili et al., 2013b), flow shop scheduling (Li and Yin, 2013b), and engineering optimization problems (Gandomi et al., 2013b; Yang et al., 2013). These meta-heuristic approaches can make full use of population information and usually find optimal or near-optimal solutions. The first gradient-free optimization method, the genetic algorithm (GA) (Goldberg, 1998), was originally proposed after in-depth study of evolutionary theory in the 1960s and 1970s. More recently, various techniques have been designed for optimization tasks, such as cuckoo search (CS) (Li et al., 2013; Wang et al., 2015b; Wang et al., 2014e; Wang et al., 2012c; Yang and Deb, 2009), biogeography-based optimization (BBO) (Li et al., 2011; Li and Yin, 2012b; Mirjalili et al., 2014b; Saremi et al., 2014a; Simon, 2008), artificial bee colony (ABC) (Karaboga and Basturk, 2007; Li and Yin, 2012c), genetic programming (GP) (Gandomi and Alavi, 2011), stud GA (SGA) (Khatib and Fleming, 1998b; Wang et al., 2014b), differential evolution (DE) (Gao et al., 2009; Li and Yin, 2014; Storn and Price, 1997; Wang et al., 2014c; Wang et al., 2012b; Zou et al., 2011b), ant lion optimizer (ALO) (Mirjalili, 2015a), chicken swarm optimization (CSO) (Meng et al., 2014), wolf search algorithm (WSA) (Fong et al., 2015), multi-verse optimizer (MVO) (Mirjalili et al., 2015), earthworm optimization algorithm (EWA) (Wang et al., 2015a), grey wolf optimizer (GWO) (Mirjalili et al., 2014a; Saremi et al., 2014b), dragonfly algorithm (DA) (Mirjalili, 2015b), harmony search (HS) (Geem et al., 2001; Wang et al., 2013a; Zou et al., 2011a; Zou et al., 2010a; Zou et al., 2010b), bird swarm algorithm (BSA) (Meng et al., 2015), moth-flame optimization (MFO) (Mirjalili, 2015c), animal migration optimization (AMO) (Li et al., 2014), particle swarm optimization (PSO) (Kennedy and Eberhart, 1995; Mirjalili et al., 2014d; Wang et al., 2014d; Zhao et al., 2015), ant colony optimization (ACO) (Dorigo and Stutzle, 2004), bat algorithm (BA) (Cai et al., 2014; Gandomi et al., 2013a; Mirjalili et al., 2013a; Xue et al., 2015; Yang and Gandomi, 2012), and, especially, the KH algorithm (Gandomi and Alavi, 2012; Wang et al., 2015c; Wang et al., 2014i), inspired by the herding behavior of krill individuals (Wang et al., 2013b).
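The KH algorithm mentioned above updates each krill's position by combining three effects: motion induced by other krill, foraging motion, and random physical diffusion (Gandomi and Alavi, 2012). The following is a simplified single-krill step under strong simplifying assumptions: the induced and foraging directions are reduced to attraction toward the best krill and the food location, and all coefficient values are illustrative, not the original settings.

```python
import numpy as np

def krill_step(x, n_old, f_old, x_best, x_food, dt=0.5,
               n_max=0.01, v_f=0.02, d_max=0.005, inertia=0.5, rng=None):
    """Simplified krill herd position update: x += dt * (N + F + D),
    where N is motion induced by other krill, F is foraging motion,
    and D is random physical diffusion."""
    rng = np.random.default_rng() if rng is None else rng
    n_new = n_max * (x_best - x) + inertia * n_old      # induced motion
    f_new = v_f * (x_food - x) + inertia * f_old        # foraging motion
    d = d_max * rng.uniform(-1, 1, size=x.shape)        # random diffusion
    return x + dt * (n_new + f_new + d), n_new, f_new
```

Iterating this step draws the herd toward promising regions while the diffusion term preserves some exploration; FKH additionally applies the firefly-style ALIU operator during the update.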
