Genetic algorithms (GAs) (Holland, 1975; Goldberg, 1989) try to find the solution to a problem using an initial group of individuals, the population, where each one represents a potential solution. They are currently applied with success in very diverse fields (Yang, Shan, & Bui, 2008; Yu, Davis, Baydar, & Roy, 2008); nevertheless, GAs have limitations when the search space contains more than one global solution, or a single global solution together with multiple local optima. A classical GA faced with such a situation tends to focus the search on the surroundings of the global solution; however, it would be interesting to know a higher number of possible solutions for several reasons: more precise information about the search space, easier implementation of the local solutions compared with the global one, simpler interpretation of certain solutions compared with others, and so forth. To achieve that knowledge, an iterative process is executed until the desired goals are reached. The process starts by grouping the individuals into species that independently search for a solution in their own environments; then, the crossover operation involves individuals from different species so that no area of the search space is left unexplored. The process is repeated according to the goals achieved.
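The species-based iterative process outlined above can be sketched as follows. This is a minimal illustrative sketch, not the exact method described in this chapter: the multimodal fitness function, the distance-based speciation radius, and the blend crossover with Gaussian mutation are all assumptions chosen for brevity.

```python
import math
import random

def fitness(x):
    # Illustrative multimodal function with several peaks (an assumption).
    return math.sin(3 * x) ** 2 / (1 + x * x)

def form_species(population, radius=0.5):
    """Group individuals whose genotypes lie within `radius` of a species seed."""
    species = []
    for ind in sorted(population, key=fitness, reverse=True):
        for s in species:
            if abs(ind - s[0]) < radius:
                s.append(ind)
                break
        else:
            species.append([ind])  # fittest ungrouped individual founds a new species
    return species

def evolve(population, generations=30):
    for _ in range(generations):
        species = form_species(population)
        offspring = []
        # Crossover may involve parents from different species,
        # so no area of the search space is left unexplored.
        for _ in range(len(population)):
            a = random.choice(random.choice(species))
            b = random.choice(random.choice(species))
            child = (a + b) / 2 + random.gauss(0, 0.1)  # blend crossover + mutation
            offspring.append(child)
        # Elitist replacement: keep the best of parents and offspring.
        population = sorted(population + offspring,
                            key=fitness, reverse=True)[:len(population)]
    return population
```

Because replacement is elitist, the best fitness in the population never decreases, while the inter-species crossover keeps several niches populated at once.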
There are problems that do not have a unique global solution, but rather multiple optima, either global or local: the multimodal problems (Ehrgott, 2005).
When dealing with this type of problem, it is interesting to know as many solutions as possible. On the one hand, the knowledge about the problem might not be complete; this leads to uncertainty about the goodness of the obtained solution, as it cannot be guaranteed that better solutions do not exist in the unexplored areas of the search space. On the other hand, even if the best solution is achieved, there might be other solutions that, for different reasons (economy, simplicity), are preferable.
Brief Introduction To Genetic Algorithms
GAs are adaptive methods, generally used in problems of search and of parameter optimization, based on sexual reproduction and on the “survival of the fittest” theory (Tomassini, 1995; Beasley, Bull, & Martin, 1993; De Jong, 2002).
A population, a group of individuals each of which represents a potential solution, is initially created and evolves through successive generations. The best individuals tend to be kept after several generations, but less-fitted individuals are also retained in order to maintain diversity. Diversity ensures that there are individuals with different characteristics, some of which may turn out to be suitably adapted to eventual changes in the environment.
The best-fitted natural individuals are those that have more possibilities of having descendants, following the natural selection principles proposed by Darwin (1859).
In nature, individuals usually establish different groups, each of them specialized in different tasks: hunting, harvesting, and so forth. But the traditional GAs do not envisage this possibility for reaching several solutions; the present work will study an extension that will bear this in mind.
Any potential solution to a problem can be represented by providing values for a series of parameters. The set of these parameters (or genes) is codified by a string of values: the chromosome. The encoding is usually done by means of binary values, although other representations can also be used (see Figure 1).
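Such a binary encoding can be sketched as follows. This is a hedged illustration: the parameter range and the 16-bit gene length are arbitrary choices, not values prescribed by the chapter.

```python
def encode(value, lo, hi, bits=16):
    """Map a real-valued parameter in [lo, hi] to a fixed-length binary gene."""
    step = (hi - lo) / (2 ** bits - 1)
    n = round((value - lo) / step)
    return format(n, f"0{bits}b")

def decode(gene, lo, hi):
    """Recover the real value represented by a binary gene."""
    bits = len(gene)
    step = (hi - lo) / (2 ** bits - 1)
    return lo + int(gene, 2) * step

# A chromosome is the concatenation of its genes (here, two parameters).
chromosome = encode(1.5, -3.0, 3.0) + encode(-0.25, -3.0, 3.0)
```

The decoding error is bounded by half the quantization step, so more bits per gene give a finer-grained search space.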
Key Terms in this Chapter
Evolutionary Technique: Technique that tries to provide valid solutions for a specific problem using concepts taken from nature or biology, such as the survival of the fittest.
Niching: Separation of individuals according to their positions in the search space, or maintenance of diversity by appropriate techniques, for example, local population models, fitness sharing, or distributed evolutionary algorithms.
Genetic Algorithm: A special type of evolutionary technique where the potential solutions are represented by means of chromosomes (usually either binary or real sets of values). Each gene (or set of genes) represents a variable or parameter within the global solution.
Fitness: Value derived from the evaluation of an individual with respect to its reproduction capability. This measure determines the goodness of the solution encoded by an individual. Usually, selection in evolutionary algorithms depends on the fitness.
Search Space: Set of all possible situations of the problem to be solved; the combination of all the possible values of all the variables involved in the problem.
Diversity: Measure of the genotypic difference between different individuals. It is necessary to keep a high ratio of diversity to explore in depth the search space and avoid a premature gene convergence due to the genetic drift.
Multimodal Problem: Special kind of problem with several global solutions, or with one global solution and several local optima, instead of a unique global optimum.
Phenotype: Expression of the properties coded by the individual’s genotype. The precise definition of phenotypes is mostly problem dependent. For parameter optimization the phenotype is usually identical with the object parameters, whereas for structure optimization (e.g., of neural networks) the phenotype represents a specific structure (in this case, the genotype represents the value parameters of the structure).
Species: Within the context of genetic algorithms, subset of genetic individuals with similar genotypes (genetic values) that explores a similar area in the search space.
Genotype: Set of values (real, binary, …) that codifies the internal solution representation to which the crossover and mutation operators are applied.
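The fitness-sharing technique mentioned under Niching can be sketched as follows. This is a minimal sketch of the standard sharing scheme (Goldberg, 1989), assuming one-dimensional genotypes and a triangular sharing function; the niche radius `sigma` and exponent `alpha` are illustrative parameters.

```python
def shared_fitness(population, raw_fitness, sigma=1.0, alpha=1.0):
    """Divide each individual's raw fitness by its niche count, so crowded
    regions of the search space are penalized and diversity is preserved."""
    shared = []
    for i, xi in enumerate(population):
        niche = 0.0
        for xj in population:
            d = abs(xi - xj)
            if d < sigma:
                niche += 1 - (d / sigma) ** alpha  # sharing function
        shared.append(raw_fitness[i] / niche)  # niche >= 1 (self-distance is 0)
    return shared
```

Individuals packed into one niche see their fitness divided among themselves, while an isolated individual keeps its raw fitness, which lets selection maintain several species at once.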