Accelerated Cuckoo Search With Extended Diversification and Intensification

Deepak Garg, Pardeep Kumar
Copyright: © 2021 | Pages: 24
DOI: 10.4018/IJSIR.2021070106

Abstract

Metaheuristics are effective at solving NP-hard problems in reasonable time, but because they require many parameter settings they lack generality (i.e., they are not easy to apply to all types of problems) and can also struggle with global search. The cuckoo search (CS) algorithm, by contrast, takes only one input parameter and has a good probability of reaching the global solution thanks to Lévy flight. However, CS lacks self-adaptive parameters and extended search strategies. In this paper, cuckoo search is studied in depth and its performance is improved by introducing a self-adaptive step size, extended alien-egg discovery replacement (applied on each dimension using a good-neighbor study), and an adaptive discovery probability; the resulting algorithm is named accelerated cuckoo search (ACS). ACS is then applied, as an example, to the load balancing problem in the cloud with minimum makespan time as the objective parameter in order to evaluate its performance against CS. Furthermore, to validate the superiority of ACS over CS beyond this problem, the two algorithms have also been successfully compared on a few benchmark functions.
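
As background for the load-balancing objective mentioned above, the sketch below shows the makespan measure in its usual cloud-scheduling form: the completion time of the busiest virtual machine under a given task-to-VM assignment. This is a generic illustration only; the task lengths, VM rates, and the article's exact cloud model are not given in this preview.

```python
# Illustrative sketch (not the article's exact model): makespan of a
# task-to-VM assignment, i.e. the completion time of the busiest VM.
from typing import Sequence

def makespan(assignment: Sequence[int],
             task_lengths: Sequence[float],
             vm_rates: Sequence[float]) -> float:
    """assignment[i]   = index of the VM that runs task i
    task_lengths[i] = workload of task i (e.g. million instructions)
    vm_rates[j]     = processing speed of VM j (e.g. MIPS)"""
    finish = [0.0] * len(vm_rates)
    for task, vm in enumerate(assignment):
        finish[vm] += task_lengths[task] / vm_rates[vm]
    return max(finish)  # the objective to be minimized

# Example: 5 tasks on 2 VMs -> VM0 finishes at 7.0, VM1 at 11.0, makespan = 11.0
print(makespan([0, 1, 0, 1, 1], [400, 200, 300, 100, 250], [100.0, 50.0]))
```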
Article Preview

1. Introduction

Metaheuristics are nature-inspired, higher-level heuristics that do not get stuck at local optima easily because they combine intensification and diversification: intensification drives convergence, while diversification prevents premature convergence. Another key feature of metaheuristics is that they are not problem-specific. Randomization is an essential characteristic of these algorithms; a random walk consists of independent random steps taken in random directions. A good metaheuristic balances this randomness with intensification to deliver near-global solutions.

To date, many metaheuristic algorithms have been proposed. One of the earliest and most successful was the Genetic Algorithm proposed by Holland, based on evolutionary behavior (Holland et al., 1975). In 1983, Kirkpatrick et al. developed SA (Simulated Annealing), inspired by the process of gradually cooling molten metal to reach a state of minimum energy (Kirkpatrick et al., 1983). In 1986, Glover developed Tabu Search, based on the human memory mechanism (Glover et al., 1986). In 1992, Dorigo developed ACO (Ant Colony Optimization), based on the way ants wisely find the shortest route between the home nest and a food source (Dorigo et al., 1992). In 1992, Moscato et al. developed the memetic approach, inspired by memes (Moscato et al., 1992). In 1995, Kennedy et al. developed PSO (Particle Swarm Optimization), based on the collision-free flocking of bird groups (Kennedy et al., 1995). Passino developed BFO (Bacterial Foraging Optimization), based on the movement of bacteria toward nutrients (Passino et al., 2002). Eusuff et al. developed SFLA (Shuffled Frog Leaping Algorithm), based on how frogs jump between groups and scatter randomly to exchange information with the remaining frogs while searching for food (Eusuff et al., 2003). Karaboga et al. developed ABC (Artificial Bee Colony Algorithm), based on the food-searching behavior of honey bees (Karaboga et al., 2005). Yang et al. developed FFA (Firefly Algorithm), motivated by the flashing or glowing behavior of fireflies for mating and food search (Yang et al., 2010; Yang et al., 2013). Simon developed BBO (Biogeography-Based Optimization), motivated by the migration of species from one habitat to another (Simon et al., 2008). Yang et al. proposed the Cuckoo Search Algorithm (CSA), which is based on the brood-parasitic breeding behavior of the cuckoo bird (Yang et al., 2009; Yang et al., 2010; Yang et al., 2014). Yang et al. also developed the Bat Algorithm, which mimics the echolocation behavior of bats, which emit high-frequency pulses to search for prey and to navigate (Yang et al., 2013). In 2013, Yang et al. proposed FPA (Flower Pollination Algorithm), inspired by the pollination process of flowers (Yang et al., 2013). Many algorithms have been proposed since, but they all require several input parameters to be tuned for every problem and are therefore not generally applicable. In contrast, having only one parameter (the discovery probability) and relying on Lévy flight, cuckoo search has been applied successfully, with good results, to many unimodal and multimodal problems (Yang et al., 2010; Han et al., 2015; Li et al., 2015; Ali et al., 2016; Huang et al., 2016; Li et al., 2016; Wang et al., 2016; Han et al., 2017; Rakhshani et al., 2017).
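
Since the article builds on standard cuckoo search, the following is a minimal Python sketch of that baseline in the spirit of Yang and Deb's formulation: a Lévy-flight random walk generates new solutions, and a fraction pa of the eggs is discovered and replaced each generation. The population size, pa = 0.25, the Mantegna-style Lévy step, and the sphere benchmark are conventional illustrative choices, not settings from this article.

```python
import math
import numpy as np

def levy_step(dim, beta=1.5):
    # Mantegna's algorithm for Levy-distributed step lengths
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(f, dim=10, n_nests=25, pa=0.25, iters=500, lb=-5.0, ub=5.0):
    nests = np.random.uniform(lb, ub, (n_nests, dim))
    fitness = np.array([f(x) for x in nests])
    best_idx = int(fitness.argmin())
    best, best_val = nests[best_idx].copy(), fitness[best_idx]
    for _ in range(iters):
        # Global random walk: each cuckoo takes a Levy flight around the best nest
        for i in range(n_nests):
            step = 0.01 * levy_step(dim) * (nests[i] - best)
            new = np.clip(nests[i] + step * np.random.randn(dim), lb, ub)
            new_fit = f(new)
            j = np.random.randint(n_nests)      # compare with a randomly chosen nest
            if new_fit < fitness[j]:
                nests[j], fitness[j] = new, new_fit
        # Discovery: a fraction pa of eggs is found and replaced by a biased random walk
        found = np.random.rand(n_nests, dim) < pa
        p1, p2 = np.random.permutation(n_nests), np.random.permutation(n_nests)
        nests = np.clip(nests + found * np.random.rand(n_nests, 1)
                        * (nests[p1] - nests[p2]), lb, ub)
        fitness = np.array([f(x) for x in nests])
        idx = int(fitness.argmin())
        if fitness[idx] < best_val:             # keep the best solution seen so far
            best, best_val = nests[idx].copy(), fitness[idx]
    return best, best_val

best, val = cuckoo_search(lambda x: float(np.sum(x ** 2)))   # sphere benchmark
print(val)
```

Note that the step size 0.01, pa, and the Lévy exponent beta stay constant throughout the run; these are exactly the fixed internal parameters that the contribution below replaces with adaptive counterparts.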

1.1 Our Contribution

Several studies have shown that CS may not assure fast convergence because of its random walk and some constant internal parameters (Valian et al., 2013; Singh et al., 2014; Wang et al., 2016). Furthermore, a small or constant step size or parameter may cause CSA to get stuck at local optima. To address this, accelerated cuckoo search is proposed with a self-adaptive step size, extended alien-egg discovery replacement (applied on each dimension using a good-neighbor study), and an adaptive discovery probability. Three types of random walks have been evaluated with these modifications to improve makespan time for load balancing in the cloud as an example, and the superiority of ACS over CS has also been validated on a few benchmark functions.
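
The exact ACS update rules are not reproduced in this preview, so the sketch below only illustrates the three kinds of adaptation named above using assumed formulas: an exponentially decaying step size, a linearly shrinking discovery probability, and per-dimension discovery replacement that pulls each component toward a better-ranked ("good") neighbor. The schedules and the neighbor-selection rule are placeholders, not the authors' equations.

```python
import numpy as np

def adaptive_step(alpha0, t, t_max, decay=3.0):
    # Assumed schedule: large steps early (diversification), small steps late (intensification).
    return alpha0 * np.exp(-decay * t / t_max)

def adaptive_pa(pa_min, pa_max, t, t_max):
    # Assumed schedule: the discovery probability shrinks linearly over the run.
    return pa_max - (pa_max - pa_min) * t / t_max

def discover_per_dimension(nests, fitness, pa, lb, ub):
    """Per-dimension alien-egg discovery: each discovered component is pulled
    toward the same component of a randomly chosen better-ranked ('good') neighbor."""
    n, dim = nests.shape
    order = np.argsort(fitness)              # order[0] is the best nest
    rank = np.empty(n, dtype=int)
    rank[order] = np.arange(n)               # rank[i] = position of nest i in the ranking
    new = nests.copy()
    for i in range(n):
        good = nests[order[np.random.randint(rank[i])]] if rank[i] > 0 else nests[i]
        for d in range(dim):
            if np.random.rand() < pa:        # this egg component is discovered
                new[i, d] += np.random.rand() * (good[d] - nests[i, d])
    return np.clip(new, lb, ub)

# Example: at iteration 40 of 100 with alpha0 = 1.0 and pa in [0.05, 0.25]
print(adaptive_step(1.0, 40, 100), adaptive_pa(0.05, 0.25, 40, 100))
```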
