An Effective Solution to Regression Problem by RBF Neuron Network


Dang Thi Thu Hien, Hoang Xuan Huan, Le Xuan Minh Hoang
DOI: 10.4018/IJORIS.2015100104

Abstract

The Radial Basis Function (RBF) neural network is widely applied to multivariate function regression. However, choosing the number of hidden-layer neurons and suitable centres so that the network yields a good regression function is still an open problem studied by many researchers. This article proposes using the nodes of an equally spaced grid as the centres of the hidden layer. The value of the regression function at each centre is then estimated with the k-nearest-neighbour method, and the network is trained with an interpolation RBF training algorithm for equally spaced nodes. Experiments show that the resulting regression function is highly effective when the training data contains Gaussian white noise.
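
The following is a minimal sketch of the pipeline described above, under simple assumptions (1-D inputs, Gaussian basis functions, radius tied to the grid spacing); the function names and parameter choices are illustrative, not the authors' implementation.

import numpy as np

# Sketch of the proposed pipeline:
# 1) equally spaced grid nodes serve as hidden-layer centres,
# 2) the regression value at each centre is estimated by k-nearest-neighbour
#    averaging over the noisy training samples,
# 3) the output-layer weights are obtained by solving the interpolation system
#    at the centres.

def knn_value_at_centres(x_train, y_train, centres, k=5):
    """Estimate the regression value at each grid centre as the mean of its k nearest samples."""
    values = np.empty(len(centres))
    for i, c in enumerate(centres):
        idx = np.argsort(np.abs(x_train - c))[:k]
        values[i] = y_train[idx].mean()
    return values

def fit_rbf(centres, values, sigma):
    """Solve Phi w = values for the output-layer weights (Gaussian radial basis)."""
    phi = np.exp(-((centres[:, None] - centres[None, :]) ** 2) / (2 * sigma ** 2))
    return np.linalg.solve(phi, values)

def predict_rbf(x, centres, weights, sigma):
    phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * sigma ** 2))
    return phi @ weights

# Toy usage: noisy samples of sin(x) corrupted by Gaussian white noise.
rng = np.random.default_rng(0)
x_train = rng.uniform(0, 2 * np.pi, 200)
y_train = np.sin(x_train) + rng.normal(0, 0.1, x_train.shape)

centres = np.linspace(0, 2 * np.pi, 15)   # equally spaced grid nodes
sigma = centres[1] - centres[0]           # radius tied to grid spacing (assumption)
values = knn_value_at_centres(x_train, y_train, centres, k=7)
weights = fit_rbf(centres, values, sigma)
y_hat = predict_rbf(np.linspace(0, 2 * np.pi, 100), centres, weights, sigma)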

1. Introduction

Multivariate function regression is a traditional and very important problem in numerical analysis with wide application (Alpaydin, 2010; Bartels, Beatty, & Barsky, 1987; Broomhead & Lowe, 1988; Collatz, 1966; Park & Sandberg, 1993; Tomohiro, Sadanori, & Seiya, 2008). When the noise distribution characteristics are not of concern, the problem is stated and applied through interpolation and function approximation. The one-dimensional case was studied and solved by Lagrange and Chebyshev using polynomials as regression functions. Since the mid-20th century, with the development and application of machine learning, image processing, computer graphics, and engineering problems, the regression problem has continued to attract many researchers. Among the open questions, the choice of the regression function's form and of a good method for determining it remains a prime research topic (Blanzieri, 2003; Huan, Hien, & Huu-Tue, 2007; Schwenker, Kesler, & Gunther, 2001; Tomohiro et al., 2008).

Over the past three decades, MLP (Multilayer Perceptron) and RBF (Radial Basis Function) neural networks have been effective tools for solving this problem in applications (Blanzieri, 2003; Haykin, 1999; Huan, Hien, & Huu-Tue, 2011; Looney, 1997; Rudenko & Bezsonov, 2011).

The RBF regression method was proposed by Powell and introduced as a neural network by Broomhead and Lowe (Powell, 1988; Broomhead & Lowe, 1988). Compared with the MLP network, the RBF neural network (hereinafter, RBF network) has a short training time and is well suited to regression problems. Training an RBF network involves: 1) determining the number of hidden-layer neurons and their corresponding centres; 2) determining the radius parameters of the hidden neurons and the weights of the output layer. Choosing a suitable number of hidden neurons, together with centres and radius parameters that produce a good regression function, is still an open problem (Powell, 1988; Blanzieri, 2003; Fasshauer, 2007; Pérez-Godoy, Rivera, Carmona, & del Jesus, 2014; Schwenker et al., 2001; Tomohiro et al., 2008; Weruaga & Via, 2014). Authors usually rely on the distribution characteristics of the interpolation nodes to determine the centre and radius parameters (Guang-Bin Huang, Saratchandran, & Sundararajan, 2004; Pérez-Godoy et al., 2014; Tomohiro et al., 2008; Sum, Chi-Sing Leung, & Ho, 2009).
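
The two training phases listed above can be illustrated by the following minimal sketch; it is a generic RBF fit under stated assumptions (centres drawn from the training inputs, a shared radius taken from the average centre spacing, output weights by least squares), not the authors' specific algorithm.

import numpy as np

def train_rbf(x, y, n_hidden=10, rng=None):
    rng = np.random.default_rng(rng)
    # Phase 1: choose hidden-layer centres and a shared radius parameter.
    centres = np.sort(rng.choice(x, size=n_hidden, replace=False))
    sigma = np.mean(np.diff(centres))    # heuristic radius from centre spacing (assumption)
    # Phase 2: output-layer weights by linear least squares on the design matrix.
    phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * sigma ** 2))
    weights, *_ = np.linalg.lstsq(phi, y, rcond=None)
    return centres, sigma, weights

The point of the split is that only phase 2 is a linear problem; once the centres and radii are fixed, the output weights follow from a single linear solve, which is why RBF training is typically much faster than MLP training.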
