Real-world optimization problems are often too complex to be solved through analytic means. Evolutionary algorithms are a class of algorithms that borrow paradigms from nature to address them. These are stochastic optimization methods that maintain a population of individual solutions, which correspond to points in the search space of the problem. These algorithms have been immensely popular because they are derivative-free techniques, are less prone to getting trapped in local minima, and can be tailored to suit any given problem. The performance of evolutionary algorithms can be improved further by adding a local search component. The Nelder-Mead simplex algorithm (Nelder & Mead, 1965) is a simple local search algorithm that has been routinely applied to improve the search process in evolutionary algorithms, a strategy that has met with great success. In this article, we provide an overview of the various strategies that have been adopted to hybridize it with two well-known evolutionary algorithms: genetic algorithms (GA) and particle swarm optimization (PSO).
GAs are arguably among the most common population-based approaches to optimization. The candidate solutions that these algorithms maintain in each generation are called chromosomes. GAs carry out the Darwinian operators of selection, mutation, and recombination on these chromosomes to perform their search (Mitchell, 1998). Each generation is improved by removing the poorer solutions from the population while retaining the better ones, based on a fitness measure; this process is called selection. Following selection, a method of recombining solutions called crossover is applied: two (or more) parent solutions from the current generation are picked at random to produce offspring that populate the next generation. The offspring chromosomes are then probabilistically subjected to mutation, which is carried out by adding small random perturbations.
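The selection-crossover-mutation loop described above can be sketched in a few lines of Python. This is a minimal illustration rather than any particular published variant: the function name, the parameter values (population size, mutation rate, Gaussian mutation width), and the use of binary tournament selection with uniform crossover are all illustrative choices.

```python
import random

def genetic_algorithm(fitness, dim, pop_size=30, generations=100,
                      mutation_rate=0.1, bounds=(-5.0, 5.0)):
    """Minimal real-coded GA (illustrative sketch) that maximizes `fitness`."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: a binary tournament keeps the fitter of two random parents.
        def select():
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        offspring = []
        while len(offspring) < pop_size:
            p1, p2 = select(), select()
            # Crossover: each gene is copied from either parent with equal probability.
            child = [random.choice(pair) for pair in zip(p1, p2)]
            # Mutation: a small Gaussian perturbation, applied probabilistically.
            child = [g + random.gauss(0, 0.1) if random.random() < mutation_rate
                     else g for g in child]
            offspring.append(child)
        pop = offspring
    return max(pop, key=fitness)

# Maximize f(x) = -(x0^2 + x1^2); the optimum is at the origin.
random.seed(0)
best = genetic_algorithm(lambda x: -sum(g * g for g in x), dim=2)
```

Note that this sketch has no elitism; many practical GAs copy the best chromosome of each generation unchanged into the next.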
PSO is a more recent approach to optimization (Kennedy & Eberhart, 2001). Being modelled after the social behavior of organisms such as a flock of birds in flight or a school of fish swimming, it is considered an evolutionary algorithm only in a loose sense. In PSO, each solution in the population is called a particle. In every iteration, each particle’s position in the search space is updated by adding the particle’s velocity to it. The velocity is in turn adjusted towards the best position encountered in the particle’s own history (the individual best) and the best position found by any particle in the swarm so far (the global best).
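The position and velocity updates above can likewise be sketched as follows. The inertia weight `w` and the acceleration coefficients `c1` and `c2`, along with their values, are illustrative assumptions; PSO variants parameterize the update in different ways.

```python
import random

def pso(fitness, dim, swarm_size=30, iterations=200,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal PSO (illustrative sketch) that maximizes `fitness`."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]             # individual bests
    gbest = max(pbest, key=fitness)[:]      # global best
    for _ in range(iterations):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity is pulled toward the particle's own best
                # and toward the swarm's best position so far.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if fitness(pos[i]) > fitness(pbest[i]):
                pbest[i] = pos[i][:]
                if fitness(pbest[i]) > fitness(gbest):
                    gbest = pbest[i][:]
    return gbest

random.seed(1)
best = pso(lambda x: -sum(g * g for g in x), dim=2)
```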
Since evolutionary algorithms use a population of individuals and randomized variation operators, they are adept at performing exploratory searches over their search spaces. However, when the aim is to produce outputs within reasonable time limits, it is important to balance this exploration with better exploitation of smaller-scale features in the fitness landscape. In the latter context, local search algorithms improve single solutions using local information (e.g., directional trends in fitness around each solution), taking each solution towards the nearest fitness maximum. Hybrid algorithms that combine the advantages of exploration and exploitation constitute a distinct area of evolutionary computation research, variously called Lamarckian or memetic approaches, of which Nelder-Mead hybrids form a significant part.
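As a sketch of this exploitation stage, the following is a compact Nelder-Mead simplex search, written here to maximize a fitness function; in a memetic hybrid, the start point would be an individual handed over by the GA or PSO population. The coefficient names and values (reflection, expansion, contraction, shrink) follow common conventions, but the function and its greedy acceptance rules are an illustrative simplification, not the authors' specific hybridization scheme.

```python
def nelder_mead(fitness, start, step=0.5, iterations=100,
                alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Compact Nelder-Mead simplex search (illustrative) maximizing `fitness`."""
    n = len(start)
    # Initial simplex: the start point plus one perturbed vertex per dimension.
    simplex = [list(start)] + [
        [x + (step if d == i else 0.0) for d, x in enumerate(start)]
        for i in range(n)
    ]
    for _ in range(iterations):
        simplex.sort(key=fitness, reverse=True)   # best vertex first
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(v[d] for v in simplex[:-1]) / n for d in range(n)]
        # Reflect the worst vertex through the centroid of the rest.
        refl = [c + alpha * (c - w) for c, w in zip(centroid, worst)]
        if fitness(refl) > fitness(best):
            # Expansion: try moving further in the same promising direction.
            expd = [c + gamma * (r - c) for c, r in zip(centroid, refl)]
            simplex[-1] = expd if fitness(expd) > fitness(refl) else refl
        elif fitness(refl) > fitness(simplex[-2]):
            simplex[-1] = refl
        else:
            # Contraction toward the centroid; shrink the whole simplex if it fails.
            con = [c + rho * (w - c) for c, w in zip(centroid, worst)]
            if fitness(con) > fitness(worst):
                simplex[-1] = con
            else:
                simplex = [best] + [
                    [b + sigma * (x - b) for b, x in zip(best, v)]
                    for v in simplex[1:]
                ]
    return max(simplex, key=fitness)

# Polish a rough solution of f(x) = -(x0^2 + x1^2), as a memetic step might.
polished = nelder_mead(lambda x: -sum(g * g for g in x), [1.2, -0.8])
```

In a Lamarckian hybrid, the polished solution would be written back into the population, replacing the individual it started from.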
Key Terms in this Chapter
Search Space: Set of all possible solutions for any given optimization problem, in which one can usually define a neighborhood of any solution.
Fitness: A measure to determine the goodness of a solution to an optimization problem. When a single objective is to be maximized, the fitness is either equal to the objective or a monotonically increasing function of it.
Exploitation: A greedy strategy that seeks to improve one or more solutions to an optimization problem, taking each to a maximum in its vicinity.
Evolutionary Algorithm: A class of probabilistic algorithms that are based upon biological metaphors such as Darwinian evolution, and widely used in optimization.
Local Search: A search algorithm to carry out exploitation.
Population-Based Algorithm: An algorithm that maintains an entire set of candidate solutions for an optimization problem.
Generation: A term used in evolutionary algorithms that corresponds to an iteration of the outermost loop.
Exploration: A strategy that samples the fitness landscape extensively in order to locate promising regions.
Multi-Objective Optimization: An optimization problem involving more than a single objective function. In such a setting, it is not easy to discriminate between good and bad solutions, as a solution that is better than another in one objective may be poorer in another. Without any loss of generality, each objective function can be considered to be one involving maximization.
Dominance: A relationship between solutions in a multi-objective optimization problem. A solution dominates another if and only if it is at least as good as the latter in all the objectives and strictly better in at least one.
Fitness Landscape: A representation of the search space of an optimization problem that brings out the differences in the fitness of the solutions, such that those with good fitness are “higher”. Optimal solutions are the maxima of the fitness landscape.
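The dominance relation defined above (with every objective maximized, per the convention stated under Multi-Objective Optimization) can be expressed directly; the function name is illustrative.

```python
def dominates(a, b):
    """True if objective vector `a` dominates `b`: at least as good in
    every objective and strictly better in at least one (all maximized)."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

assert dominates((3, 5), (2, 5))        # better in one objective, equal in the other
assert not dominates((3, 1), (2, 5))    # a trade-off: neither solution dominates
assert not dominates((2, 5), (3, 1))
```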