A Preliminary Study on Adaptive Evolution Control Using Rank Correlation for Surrogate-Assisted Evolutionary Computation

Yudai Kuwahata, Jun-ichi Kushida, Satoshi Ono
© 2018 | Pages: 14
DOI: 10.4018/IJSI.2018100105


This article describes how surrogate-assisted evolutionary computation (SAEC) has been widely applied to approximate expensive optimization problems, which require substantial computational time, such as hours per solution evaluation. SAEC may also reduce the processing time of inexpensive optimization problems, wherein solutions are evaluated within a few seconds or minutes. To achieve this, the approximation model for the objective function should be rebuilt as few times as possible during optimization. Therefore, this article proposes an adaptive evolution control mechanism for SAEC using rank correlations between actually evaluated and approximately evaluated values of the objective function. These correlations are used to adaptively switch between the approximation and actual evaluation phases, reducing the number of runs required to learn the approximation model. Experiments show that the proposed method successfully reduced the processing time on some benchmark functions even under an inexpensive scenario.
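As a rough illustration of the switching criterion described above, the sketch below computes a Spearman rank correlation between actual and surrogate fitness values and compares it against a threshold. The threshold value (0.8), the function name, and the tie-free ranking shortcut are assumptions for illustration only; the article's actual criterion and parameters are not reproduced here.

```python
import numpy as np

def spearman_rho(a, b):
    # Convert each array to ranks (assumes no ties), then compute the
    # Pearson correlation of the ranks -- i.e., Spearman's rho.
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float((ra @ rb) / np.sqrt((ra @ ra) * (rb @ rb)))

# Actual objective values vs. surrogate predictions for the same solutions.
actual = np.array([3.0, 1.0, 4.0, 1.5, 5.0])
approx = np.array([2.9, 1.2, 4.2, 1.4, 4.8])  # preserves the ordering

rho = spearman_rho(actual, approx)
use_surrogate = rho >= 0.8  # hypothetical threshold: approximate only while ranks agree
print(rho, use_surrogate)
```

Because selection in evolutionary algorithms depends only on the ordering of fitness values, a rank correlation is a natural measure of whether the surrogate is still trustworthy for selection.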

2.1. Approximation Models

Various approximation models have been proposed for use in SAEC. Representative examples include polynomial approximation (Chen, Allen, Schrage & Mistree, 1997), kriging (Sacks, Welch, Mitchell & Wynn, 1989), radial basis function networks (RBFN) (Broomhead & Lowe, 1988), and support vector regression (SVR) (Drucker, Burges, Kaufman, Smola & Vapnik, 1996). The choice of an appropriate model depends on the target application; kriging and SVR are appropriate for low-dimensional problems, and RBFN for high-dimensional problems. This paper uses RBFN as the approximation model, as it is considered effective for high-dimensional and inexpensive problems. However, the proposed approach is not limited to RBFN.

In RBFN, a response surface $\hat{f}(\mathbf{x})$ is created by the superposition of scaled and translated basis functions. The response surface is calculated as follows:

$$\hat{f}(\mathbf{x}) = \sum_{j=1}^{m} w_j \phi_j(\mathbf{x}) \qquad (1)$$

where $m$ denotes the number of hidden layer elements and $w_j$ denotes the weight of the $j$-th element. The function $\phi_j$ is a basis function, for which a Gaussian function is widely used.

Let $\mathbf{x}_i$ and $y_i$ $(i = 1, \dots, n)$ be a pair consisting of an input and its teacher information. Learning in RBFN becomes the problem of finding a weight vector $\mathbf{w} = (w_1, \dots, w_m)^\top$ that minimizes the following function:

$$E = \sum_{i=1}^{n} \left( y_i - \hat{f}(\mathbf{x}_i) \right)^2 + \sum_{j=1}^{m} \lambda_j w_j^2 \qquad (2)$$

In equation (2), the first term is the sum of squared errors between the network output and the teacher data, and $\lambda_j$ in the second term is a weight parameter that prevents an element from reacting excessively. This suppresses the influence of overfitting to noise. The $\mathbf{w}$ that minimizes equation (2) can be obtained from the following equation using the least squares method:

$$\mathbf{w} = \left( \Phi^\top \Phi + \Lambda \right)^{-1} \Phi^\top \mathbf{y} \qquad (3)$$

where $\Phi$ is an $n \times m$ matrix as follows:

$$\Phi = \begin{pmatrix} \phi_1(\mathbf{x}_1) & \cdots & \phi_m(\mathbf{x}_1) \\ \vdots & \ddots & \vdots \\ \phi_1(\mathbf{x}_n) & \cdots & \phi_m(\mathbf{x}_n) \end{pmatrix}$$

$\Lambda$ is a diagonal matrix whose diagonal elements are $\lambda_1, \dots, \lambda_m$, and $\mathbf{y} = (y_1, \dots, y_n)^\top$. Because the centers of the basis functions are assumed to lie on the sample points, the number of basis functions increases as new sample points are added, and the response surface is then recalculated.
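The construction above can be sketched in a few lines of numpy. This is a minimal illustration of equations (1) through (3) with centers placed on the sample points; the Gaussian width `sigma`, the single shared regularization weight `lam` (the article allows a distinct $\lambda_j$ per element), and the test function are assumptions, not values from the article.

```python
import numpy as np

def gaussian_basis(X, centers, sigma):
    # Phi[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2))  -- Gaussian basis functions
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_rbfn(X, y, sigma=0.5, lam=1e-6):
    # Centers lie on the sample points, as in the article.
    Phi = gaussian_basis(X, X, sigma)
    Lam = lam * np.eye(Phi.shape[1])                    # diagonal matrix Lambda
    return np.linalg.solve(Phi.T @ Phi + Lam, Phi.T @ y)  # Eq. (3)

def predict_rbfn(X_new, centers, w, sigma=0.5):
    return gaussian_basis(X_new, centers, sigma) @ w      # Eq. (1)

# Usage: approximate the sphere function f(x) = sum(x^2) from 30 samples.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(30, 2))
y = (X ** 2).sum(axis=1)
w = fit_rbfn(X, y)
train_err = np.abs(predict_rbfn(X, X, w) - y).max()
print(train_err)  # small, since the centers coincide with the training points
```

Adding a new evaluated solution simply appends a row to `X` (and thus a new basis function) before refitting, which mirrors the incremental recalculation described above.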
