2.1. Approximation Models
Various approximation models have been proposed for use in SAEC. Representative examples include polynomial approximation (Chen, Allen, Schrage, & Mistree, 1997), kriging (Sacks, Welch, Mitchell, & Wynn, 1989), the radial basis function network (RBFN) (Broomhead & Lowe, 1988), and support vector regression (SVR) (Drucker, Burges, Kaufman, Smola, & Vapnik, 1996). The choice of an appropriate model depends on the target application: kriging and SVR are suited to low-dimensional problems, whereas RBFN is suited to high-dimensional problems. This research uses RBFN as the approximation model because it is considered effective for high-dimensional problems and is computationally inexpensive to construct. However, the proposed approach is not limited to RBFN.
In RBFN, a response surface is created by the superposition of scaled and translated basis functions. The response surface is calculated as follows:
$$\hat{y}(\mathbf{x}) = \sum_{j=1}^{m} w_j \phi_j(\mathbf{x}) \tag{1}$$

where $m$ denotes the number of hidden-layer elements and $w_j$ denotes the weight of the $j$-th element. The function $\phi_j$ is a basis function, and a Gaussian function $\phi_j(\mathbf{x}) = \exp\!\left(-\|\mathbf{x} - \mathbf{c}_j\|^2 / (2r^2)\right)$, with center $\mathbf{c}_j$ and width $r$, is widely used.
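As a concrete illustration, the superposition in equation (1) with a Gaussian basis can be sketched as follows; the centers, weights, width `r`, and the query point are hypothetical values chosen only for the example:

```python
import numpy as np

def gaussian_basis(x, center, r=1.0):
    """Gaussian basis function centered at `center` with width r."""
    return np.exp(-np.sum((x - center) ** 2) / (2.0 * r ** 2))

def rbfn_output(x, centers, weights, r=1.0):
    """Equation (1): superposition of m scaled and translated basis functions."""
    return sum(w * gaussian_basis(x, c, r) for w, c in zip(weights, centers))

# Hypothetical 2-D example with m = 3 hidden-layer elements.
centers = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])]
weights = [0.5, -0.2, 0.8]
y_hat = rbfn_output(np.array([0.5, 0.5]), centers, weights)
```

Each basis function contributes most near its own center, so the weighted sum interpolates smoothly between the hidden-layer elements.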
Let $\mathbf{x}_i$ and $y_i$ $(i = 1, \ldots, n)$ be a pair of an input and the teacher information for it. Learning in RBFN becomes the problem of finding the weight vector $\mathbf{w} = (w_1, \ldots, w_m)^{\mathsf{T}}$ that minimizes the following function:

$$E = \sum_{i=1}^{n} \bigl( \hat{y}(\mathbf{x}_i) - y_i \bigr)^2 + \sum_{j=1}^{m} \lambda_j w_j^2 \tag{2}$$

In equation (2), the first term is the sum of the squared errors between the network output and the teacher data, and $\lambda_j$ in the second term is a weight parameter that prevents an element from reacting excessively; this suppresses the influence of overfitting to noise. The weight vector $\mathbf{w}$ that minimizes equation (2) can be obtained from the following equation by the least squares method:
$$\mathbf{w} = \bigl( \Phi^{\mathsf{T}} \Phi + \Lambda \bigr)^{-1} \Phi^{\mathsf{T}} \mathbf{y} \tag{3}$$

where $\mathbf{y} = (y_1, \ldots, y_n)^{\mathsf{T}}$ and $\Phi$ is the matrix

$$\Phi = \begin{pmatrix} \phi_1(\mathbf{x}_1) & \cdots & \phi_m(\mathbf{x}_1) \\ \vdots & \ddots & \vdots \\ \phi_1(\mathbf{x}_n) & \cdots & \phi_m(\mathbf{x}_n) \end{pmatrix} \tag{4}$$

$\Lambda$ is a diagonal matrix whose diagonal elements are $\lambda_1, \ldots, \lambda_m$. Because the centers of the basis functions are assumed to lie on the sample points, the number of basis functions increases as new sample points are added, and the response surface is then recalculated.
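The fitting procedure of equations (2)–(4) can be sketched as follows, with the basis centers placed on the sample points; the Gaussian width `r`, the regularization value `lam`, and the 1-D sample data are hypothetical choices for illustration:

```python
import numpy as np

def fit_rbfn(X, y, lam=1e-3, r=1.0):
    """Solve equation (3): w = (Phi^T Phi + Lambda)^{-1} Phi^T y,
    with one Gaussian basis function centered on each sample point."""
    # Design matrix of equation (4): Phi[i, j] = phi_j(x_i).
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    Phi = np.exp(-d2 / (2.0 * r ** 2))
    Lam = lam * np.eye(len(X))  # diagonal regularization matrix Lambda
    w = np.linalg.solve(Phi.T @ Phi + Lam, Phi.T @ y)
    return w, Phi

# Hypothetical 1-D sample set with n = 3 points.
X = np.array([[0.0], [0.5], [1.0]])
y = np.array([0.0, 0.25, 1.0])
w, _ = fit_rbfn(X, y)

# Adding a new sample point adds one basis function, and the
# response surface is recalculated by re-solving equation (3).
X_new = np.vstack([X, [[0.75]]])
y_new = np.append(y, 0.5625)
w_new, _ = fit_rbfn(X_new, y_new)
```

Because the centers coincide with the sample points, the weight vector grows by one entry per added sample, matching the incremental behavior described above.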