Multi-Objective Evolutionary Algorithms

Sanjoy Das, Bijaya K. Panigrahi
Copyright © 2009 | Pages: 7
DOI: 10.4018/978-1-59904-849-9.ch167

Abstract

Real-world optimization problems are often too complex to be solved through analytical means. Evolutionary algorithms, a class of algorithms that borrow paradigms from nature, are particularly well suited to address such problems. These algorithms are stochastic methods of optimization that have recently become immensely popular because they are derivative-free, are not as prone to getting trapped in local minima (as they are population based), and have been shown to work well for many complex optimization problems. Although evolutionary algorithms have conventionally focused on optimizing single objective functions, most practical problems in engineering are inherently multi-objective in nature. Multi-objective evolutionary optimization is a relatively new and rapidly expanding area of research in evolutionary computation that looks at ways to address these problems. In this chapter, we provide an overview of some of the most significant issues in multi-objective optimization (Deb, 2001).

Background

Arguably, Genetic Algorithms (GAs) are one of the most common evolutionary optimization approaches. These algorithms maintain a population of candidate solutions, called chromosomes, in each generation. Each chromosome corresponds to a point in the algorithm’s search space. GAs perform their search using three Darwinian operators: selection, mutation, and crossover (Mitchell, 1998). Each generation is improved by systematically removing the poorer solutions while retaining the better ones, based on a fitness measure; this process is called selection. Binary tournament selection and roulette wheel selection are two popular selection methods. In binary tournament selection, two solutions, called parents, are picked randomly from the population, with replacement, and the fitter of the two is retained, while in roulette wheel selection, the probability of a solution being picked is directly proportional to its fitness.
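
As an illustration, a minimal Python sketch of these two selection schemes might look as follows; the function names and the assumption that larger fitness values are better are choices made here for illustration, not prescribed by the chapter.

```python
import random

def binary_tournament(population, fitness, rng=random):
    """Pick two candidates at random (with replacement) and keep the fitter one."""
    a, b = rng.choice(population), rng.choice(population)
    return a if fitness(a) >= fitness(b) else b

def roulette_wheel(population, fitness, rng=random):
    """Pick a candidate with probability proportional to its fitness
    (assumes all fitness values are non-negative)."""
    weights = [fitness(x) for x in population]
    return rng.choices(population, weights=weights, k=1)[0]
```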

Following selection, the crossover operator is applied. Usually, two parent solutions from the current generation are picked randomly to produce offspring that populate the next generation of solutions. The offspring are created from the parent solutions in such a manner that they bear characteristics of both. The offspring chromosomes are then probabilistically subjected to another operator called mutation, which adds small random perturbations; only a few solutions undergo mutation. Evolution Strategies (ES) form another class of evolutionary algorithms that is closely related to GAs and uses similar operators.
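
A rough sketch of one-point crossover and perturbation-based mutation for real-valued chromosomes is given below; the cut point, mutation rate, and perturbation size are illustrative assumptions, not values taken from the chapter.

```python
import random

def one_point_crossover(parent1, parent2, rng=random):
    """Exchange the tails of two equal-length chromosomes at a random cut point,
    so each offspring bears characteristics of both parents."""
    cut = rng.randrange(1, len(parent1))
    child1 = parent1[:cut] + parent2[cut:]
    child2 = parent2[:cut] + parent1[cut:]
    return child1, child2

def mutate(chromosome, rate=0.05, sigma=0.1, rng=random):
    """Add a small Gaussian perturbation to each gene with probability `rate`."""
    return [g + rng.gauss(0.0, sigma) if rng.random() < rate else g
            for g in chromosome]
```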

Particle Swarm Optimization (PSO) is a more recent approach (Clerc, 2005). It is modeled after the social behavior of organisms such as a flock of birds or a school of fish, and is thus only loosely classified as an evolutionary approach. Each solution within the PSO population, called a particle, has a unique position in the search space. In each generation, the position of each particle is updated by adding the particle’s own velocity to it. The velocity of a particle, a vector, is itself incremented toward the best location encountered in the particle’s own history (called the individual best), as well as toward the best position in the current iteration (called the global best).
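
The velocity and position update can be sketched roughly as below. The inertia weight w and the acceleration coefficients c1 and c2 are standard PSO parameters that the chapter does not spell out, so the values here are placeholders.

```python
import random

def pso_step(position, velocity, individual_best, global_best,
             w=0.7, c1=1.5, c2=1.5, rng=random):
    """One PSO update: pull the particle toward its individual best and the
    global best, then move the particle by its new velocity."""
    new_velocity = [
        w * v
        + c1 * rng.random() * (ib - x)   # attraction to the individual best
        + c2 * rng.random() * (gb - x)   # attraction to the global best
        for x, v, ib, gb in zip(position, velocity, individual_best, global_best)
    ]
    new_position = [x + v for x, v in zip(position, new_velocity)]
    return new_position, new_velocity
```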

Key Terms in this Chapter

Elitism: A strategy in evolutionary algorithms where the best one or more solutions in each generation, called the elites, are inserted into the next generation without undergoing any change. This strategy usually speeds up the convergence of the algorithm. In a multi-objective framework, any non-dominated solution can be considered an elite.

Fitness Landscape: A representation of the search space of an optimization problem that brings out the differences in the fitness of the solutions, such that those with good fitness are “higher”. Optimal solutions are the maxima of the fitness landscape.

Multi-Objective Optimization: An optimization problem involving more than a single objective function. In such a setting, it is not easy to discriminate between good and bad solutions, as a solution that is better than another in one objective may be poorer in another (see the dominance-check sketch after these key terms). Without any loss of generality, any optimization problem can be cast as one involving minimizations only.

Population Based Algorithm: An algorithm that maintains an entire set of candidate solutions, each corresponding to a unique point in the search space of the problem.

Generation: A term used in evolutionary algorithms that roughly corresponds to each iteration of the outermost loop. The offspring obtained in one generation become the parents of the next.

Evolutionary Algorithm: A class of probabilistic algorithms that are based upon biological metaphors, such as Darwinian evolution, and are widely used in optimization.

Objective Function: The function that is to be optimized. In a minimization problem, the fitness varies inversely with the objective function.

Search Space: The set of all possible solutions for a given optimization problem. Almost always, a neighborhood around each solution can also be defined in the search space.

Fitness: A measure that is used to determine the goodness of a solution for an optimization problem.
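
Since the terms above cast optimization as minimization and refer to non-dominated solutions, a small sketch of a Pareto-dominance test may help make the notion concrete; it assumes, as the chapter suggests, that all objectives are to be minimized.

```python
def dominates(a, b):
    """Return True if objective vector `a` Pareto-dominates `b` under
    minimization: `a` is no worse than `b` in every objective and
    strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))
```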
