Implementing Negative Correlation Learning in Evolutionary Ensembles with Suitable Speciation Techniques

Peter Duell and Xin Yao (The Centre of Excellence for Research in Computational Intelligence and Applications (CERCIA), University of Birmingham, UK)
Copyright: © 2008 |Pages: 26
DOI: 10.4018/978-1-59904-807-9.ch016

Abstract

Negative correlation learning (NCL) is a technique that attempts to create an ensemble of neural networks whose outputs are accurate but negatively correlated. The motivation for the technique can be found in the bias-variance-covariance decomposition of an ensemble's generalization error. NCL is also increasingly used within an evolutionary process, which opens up the possibility of adapting the structures of the networks at the same time as learning their weights. This chapter examines the motivation for and characteristics of the NCL algorithm. Some recent work on implementing NCL in a single-objective evolutionary framework for classification tasks is presented, and we examine the impact of two speciation techniques: implicit fitness sharing and an island model population structure. An unsuitable speciation technique can be detrimental to NCL's ability to produce accurate and diverse ensembles, so the choice should be made carefully. The chapter also provides an overview of other researchers' work with NCL and some promising directions for future research.
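The core idea behind NCL can be illustrated with a minimal sketch. Following the standard formulation (Liu and Yao, 1999), each ensemble member i minimizes E_i = ½(F_i − d)² + λ·p_i with the penalty p_i = −(F_i − F̄)², which rewards outputs that deviate from the ensemble mean. The toy linear learners, the learning rate, and the data below are illustrative assumptions, not the chapter's own implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

M = 4                      # ensemble size
lam = 0.5                  # penalty strength lambda, typically in [0, 1]
X = rng.normal(size=(64, 3))
d = X @ np.array([1.0, -2.0, 0.5])      # toy regression target

W = rng.normal(scale=0.1, size=(M, 3))  # one linear learner per row

def ncl_step(W, X, d, lam, lr=0.05):
    """One gradient step on the per-network NCL error
    E_i = 1/2 (F_i - d)^2 + lam * p_i, with p_i = -(F_i - Fbar)^2."""
    F = X @ W.T                          # (n_samples, M) individual outputs
    Fbar = F.mean(axis=1, keepdims=True) # ensemble mean output
    # dE_i/dF_i, with the usual simplification of treating Fbar as constant:
    # accuracy term (F_i - d) minus the decorrelating pull toward Fbar
    grad_out = (F - d[:, None]) - lam * (F - Fbar)
    grad_W = grad_out.T @ X / len(X)
    return W - lr * grad_W

for _ in range(200):
    W = ncl_step(W, X, d, lam)

F = X @ W.T
mse = np.mean((F.mean(axis=1) - d) ** 2)
print("ensemble MSE:", mse)
```

With λ = 0 the members train independently; as λ grows, each member is pushed away from the ensemble mean, trading individual accuracy for diversity. Note that the gradient of the *mean* output is unaffected by the penalty, so the ensemble as a whole still converges toward the target.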
