PMCNS: Using a Progressively Stricter Fitness Criterion to Guide Novelty Search


Jorge Gomes (LabMAg, Faculdade de Ciências da Universidade de Lisboa, Lisboa, Portugal and Instituto de Telecomunicações, Lisboa, Portugal), Paulo Urbano (LabMAg, Faculdade de Ciências da Universidade de Lisboa, Lisboa, Portugal) and Anders Lyhne Christensen (Instituto Universitário de Lisboa (ISCTE-IUL), Lisboa, Portugal and Instituto de Telecomunicações, Lisboa, Portugal)
Copyright: © 2014 | Pages: 19
DOI: 10.4018/ijncr.2014040101

Abstract

Novelty search is an evolutionary approach in which the population is driven towards behavioural innovation instead of towards a fixed objective. The use of behavioural novelty to score candidate solutions precludes convergence to local optima. However, in novelty search, significant effort may be spent on exploration of novel, but unfit behaviours. We propose progressive minimal criteria novelty search (PMCNS) to overcome this issue. In PMCNS, novelty search can freely explore the behaviour space as long as the solutions meet a progressively stricter fitness criterion. We evaluate the performance of our approach by evolving neurocontrollers for swarms of robots in two distinct tasks. Our results show that PMCNS outperforms fitness-based evolution and pure novelty search, and that PMCNS is superior to linear scalarisation of novelty and fitness scores. An analysis of behaviour space exploration shows that the benefits of novelty search are conserved in PMCNS despite the evolutionary pressure towards progressively fitter behaviours.
Article Preview

1. Introduction

Evolutionary algorithms typically assess the quality of candidate solutions by estimating how far they are from a fixed objective. The distance between a candidate solution and the objective is estimated by a pre-defined fitness function, and the closer an individual is to the objective, the higher the fitness score it receives. While this traditional approach may appear effective, fitness-based evolutionary processes often get trapped in low-quality local optima (Whitley, 1991). A local optimum is a dead end in the search space from which the evolutionary process cannot easily escape. An evolutionary process may reach a local optimum because the fitness function does not necessarily reward the stepping stones in the search space that ultimately lead to the global objective. Fitness functions can, in fact, inadvertently deceive the evolutionary process and lead it in the wrong direction, towards local optima (Mitchell et al., 1992). Such fitness functions are called deceptive fitness functions (Goldberg, 1987). The more ambitious the objective is, the harder it may be to define a non-deceptive fitness function (Zaera et al., 1996; Jones & Forrest, 1995).

Lehman and Stanley (2011a) recently proposed a radically different evolutionary approach called novelty search. In novelty search, candidate solutions are scored based on the novelty of their behaviour with respect to the behaviours of previously evaluated individuals, and not based on a traditional fitness function. Given the lack of a static objective, novelty search avoids deception. The approach has been successfully applied to many different domains, including evolutionary robotics (Lehman & Stanley, 2011b; Mouret, 2011; Mouret & Doncieux, 2012; Gomes et al., 2013; Risi et al., 2009). Besides avoiding getting trapped in local optima, novelty search has been demonstrated capable of finding more diverse and less complex solutions than traditional fitness-based evolution (Lehman & Stanley, 2011a; Gomes et al., 2013).
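The novelty score described above is commonly computed as the sparseness of an individual's behaviour: its mean distance to the k nearest neighbours among the current population and an archive of past behaviours (Lehman & Stanley, 2011a). The sketch below illustrates this idea; the function name, the Euclidean behaviour distance, and the value of k are illustrative assumptions, not taken from this article.

```python
def novelty_score(behaviour, others, k=15):
    """Sparseness of a behaviour: mean distance to its k nearest
    neighbours among `others` (current population plus the novelty
    archive, excluding the individual itself)."""
    def dist(a, b):
        # Euclidean distance between two behaviour characterisations
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(dist(behaviour, o) for o in others)[:k]
    return sum(nearest) / len(nearest)
```

For example, a behaviour far from everything previously seen gets a high score and is favoured for reproduction, regardless of how well it performs on the goal task.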

As novelty search is guided by behavioural innovation alone, its performance can be greatly affected by the size of the behaviour space, because significant effort may be spent on behaviours that are irrelevant for the goal task (Lehman & Stanley, 2010; Cuccu & Gomez, 2011). To address this problem, Lehman and Stanley (2010) proposed minimal criteria novelty search (MCNS). MCNS is an extension of novelty search where individuals must meet some pre-defined minimal criteria to be selected for reproduction. In this way, MCNS combines novelty search with objective search as the minimal criteria are typically defined based on the goal task. In Lehman and Stanley (2010), the authors applied MCNS in two maze navigation tasks and demonstrated that MCNS evolved solutions more consistently than both novelty search and fitness-based evolution. However, MCNS suffers from two major drawbacks: it can be difficult to define suitable minimal criteria, and it may be necessary to bootstrap the evolutionary process with a genome specifically evolved to satisfy the criteria.

In recent work (Gomes et al., 2013), we successfully applied novelty search to the domain of evolutionary swarm robotics. In this paper, we extend our study by applying variants of novelty search that combine novelty and fitness. We introduce progressive minimal criteria novelty search (PMCNS), which extends MCNS in two ways: (i) PMCNS uses a fitness threshold as the minimal criterion, avoiding the necessity of specifying one or more criteria by hand; (ii) starting from the lowest possible fitness score, the fitness criterion is increased dynamically, thereby limiting novelty search to regions of the behaviour space with increasingly higher fitness. The magnitude of the increase from one generation to the next depends on the fitness profile of the current population.
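The two extensions above can be sketched as follows: individuals that meet the current fitness criterion compete on novelty, while the criterion itself only ever increases, driven by the fitness profile of the current population. This is a minimal sketch of the idea, not the article's exact algorithm; the percentile parameter, the smoothing factor, and the zero score assigned to failing individuals are illustrative assumptions.

```python
def pmcns_select_score(fitness, novelty, criterion):
    """Selection score in the spirit of PMCNS: individuals meeting the
    fitness criterion compete on novelty; the rest are penalised."""
    return novelty if fitness >= criterion else 0.0

def update_criterion(criterion, population_fitnesses,
                     percentile=0.5, smoothing=0.1):
    """Raise the criterion towards a given percentile of the current
    population's fitness scores, never letting it decrease."""
    ranked = sorted(population_fitnesses)
    target = ranked[int(percentile * (len(ranked) - 1))]
    return max(criterion, criterion + smoothing * (target - criterion))
```

Starting the criterion at the lowest possible fitness score makes the first generations behave like pure novelty search; as the population improves, the criterion tightens and exploration is confined to increasingly fit regions of the behaviour space.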
