Complex-Valued Symmetric Radial Basis Function Network for Beamforming

Sheng Chen (University of Southampton, UK)
DOI: 10.4018/978-1-60566-214-5.ch007

The complex-valued radial basis function (RBF) network proposed by Chen et al. (1994) has found many applications in processing complex-valued signals, in particular in communication channel equalization and signal detection. This complex-valued RBF network, like many other existing RBF modeling methods, constitutes a black-box approach that typically seeks a sparse model representation extracted from the training data. Adopting black-box modeling is appropriate if no a priori information exists regarding the underlying data-generating mechanism. However, a fundamental principle in practical data modeling is that any a priori information concerning the system to be modeled should be incorporated into the modeling process. Many complex-valued signal processing problems, particularly those encountered in communication signal detection, have inherent symmetric properties. This contribution adopts a grey-box approach to complex-valued RBF modeling and develops a complex-valued symmetric RBF (SRBF) network model. The application of this SRBF network is demonstrated using nonlinear beamforming assisted detection for multiple-antenna aided wireless systems that employ complex-valued modulation schemes. Two training algorithms for this complex-valued SRBF network are proposed. The first method is based on a modified version of the cluster-variation enhanced clustering algorithm, while the second method is derived by modifying the orthogonal-forward-selection procedure based on the Fisher ratio of class separability measure. The effectiveness of the proposed complex-valued SRBF network and the efficiency of the two training algorithms are demonstrated in the nonlinear beamforming application.
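As a reading aid, the sketch below shows one way such a symmetry could be hard-wired into a complex-valued Gaussian RBF map: every centre is paired with its mirror image so that the resulting response is odd-symmetric. This construction, together with all names and shapes used in the code, is an illustrative assumption and not the chapter's actual SRBF formulation.

```python
import numpy as np

def gaussian_node(x, centre, width):
    """Gaussian basis response for a complex-valued input vector x."""
    # The Euclidean norm in the complex domain keeps the response real-valued.
    return np.exp(-np.linalg.norm(x - centre) ** 2 / (2.0 * width ** 2))

def odd_symmetric_rbf(x, centres, weights, width):
    """Illustrative odd-symmetric RBF map satisfying f(-x) = -f(x).

    Assumption for this sketch: the symmetry is imposed by pairing every
    centre c with its mirror image -c and sharing one complex weight
    between the pair; the chapter's SRBF model may impose the
    constellation symmetry differently.
    """
    return sum(w * (gaussian_node(x, c, width) - gaussian_node(x, -c, width))
               for c, w in zip(centres, weights))

# Toy check of the built-in symmetry on a random complex-valued input.
rng = np.random.default_rng(0)
centres = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
weights = rng.standard_normal(4) + 1j * rng.standard_normal(4)
x = rng.standard_normal(2) + 1j * rng.standard_normal(2)
assert np.isclose(odd_symmetric_rbf(-x, centres, weights, 1.0),
                  -odd_symmetric_rbf(x, centres, weights, 1.0))
```

Because the symmetry is built into the model structure rather than left to be learned, each training sample implicitly also constrains the response at its mirrored counterpart, which is the kind of benefit that incorporating a priori structure is intended to deliver.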
Chapter Preview


The radial basis function (RBF) network is a popular artificial neural network (ANN) architecture that has found wide-ranging applications in many diverse fields of engineering, see, for example, (Chen et al., 1990; Leonard & Kramer, 1991; Chen et al., 1993; Caiti & Parisini, 1994; Gorinevsky et al., 1996; Cha & Kassam, 1996; Rosenblum & Davis, 1996; Refaee et al., 1999; Muraki et al., 2001; Mukai et al., 2002; Su et al., 2002; Li et al., 2004; Lee & Choi, 2004; Ng et al., 2004; Oyang et al., 2005; Acir et al., 2005; Tan et al., 2005). The RBF method is a classical numerical technique for nonlinear functional interpolation with real-valued data (Powell, 1987). Renewed interest in the RBF method coincided with the resurgence of the field of ANNs. Connections between the RBF method and ANNs were made, and the RBF model was re-interpreted as a one-hidden-layer feedforward network (Broomhead & Lowe, 1988; Poggio & Girosi, 1990). Specifically, under the ANN interpretation, a RBF model can be considered as a processing structure consisting of a hidden layer and an output layer. Each node in the hidden layer has a radially symmetric response around a node parameter vector called a centre, with the shape of the hidden node's response determined by the chosen basis function as well as a node width parameter, while the output layer is a set of linear combiners with linear connection weights.
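To make this hidden-layer/output-layer interpretation concrete, here is a minimal forward-pass sketch for a real-valued Gaussian RBF network; the Gaussian basis choice, the function name and the array shapes are assumptions made for illustration rather than details taken from the chapter.

```python
import numpy as np

def rbf_forward(x, centres, widths, weights):
    """One-hidden-layer RBF network: Gaussian hidden nodes, linear output layer.

    x       : input vector, shape (d,)
    centres : hidden-node centre vectors, shape (M, d)
    widths  : hidden-node width parameters, shape (M,)
    weights : linear output-layer connection weights, shape (M,)
    """
    # Radially symmetric response of each hidden node around its centre.
    hidden = np.exp(-np.sum((x - centres) ** 2, axis=1) / (2.0 * widths ** 2))
    # The output layer is simply a linear combiner of the hidden responses.
    return hidden @ weights
```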

The parameters of a RBF network include its centre vectors and the variances or covariance matrices of its basis functions, as well as the weights that connect the RBF nodes to the network output. All the parameters of a RBF network can be learned together via nonlinear optimisation using gradient-based algorithms (Chen et al., 1990a; An et al., 1993; McLoone et al., 1998; Karayiannis et al., 2003; Peng et al., 2003), evolutionary algorithms (Whitehead & Choate, 1994; Whitehead, 1996; Gonzalez et al., 2003) or the expectation-maximisation algorithm (Yang & Chen, 1998; Mak & Kung, 2000). Generally, learning based on such a nonlinear approach is computationally expensive and may encounter the problem of local minima. Additionally, the network structure or the number of RBF nodes has to be determined via other means, typically based on cross validation. Alternatively, clustering algorithms can be applied to find the RBF centre vectors as well as the associated basis function variances (Moody & Darken, 1989; Chen et al., 1992; Chen, 1995; Uykan, 2003). This leaves the RBF weights to be determined by the usual linear least squares solution. Again, the number of clusters has to be determined via other means, such as cross validation. One of the most popular approaches for constructing RBF networks, however, is to formulate the problem as one of linear learning by considering the training input data points as candidate RBF centres and employing a common variance for every RBF node. A parsimonious RBF network is then identified using the orthogonal least squares (OLS) algorithm (Chen et al., 1989; Chen et al., 1991; Chen et al., 1999; Chen et al., 2003; Chen et al., 2004a).
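The two-stage route just mentioned, clustering to place the centres followed by a linear least-squares fit of the weights, can be sketched as follows; the use of k-means and a single shared width here is an illustrative assumption rather than the specific clustering algorithm discussed in this chapter.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def fit_rbf_two_stage(X, y, num_centres, width):
    """Illustrative two-stage RBF training: clustering, then least squares.

    X : training inputs, shape (N, d);  y : training targets, shape (N,)
    The k-means step and the single shared width are assumptions made for
    this sketch, not the particular clustering scheme used in the chapter.
    """
    # Stage 1: place the RBF centres with a clustering algorithm.
    centres, _ = kmeans2(X, num_centres, minit='++')
    # Stage 2: build the design matrix of hidden-node responses ...
    dists = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
    Phi = np.exp(-dists ** 2 / (2.0 * width ** 2))
    # ... and solve for the output weights by linear least squares.
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centres, weights
```

Casting the weight estimation as a linear least-squares problem is what keeps this route, like the OLS construction over candidate centres, far cheaper than jointly optimising all of the network parameters by gradient descent.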

Complete Chapter List

Search this Book:
Editorial Advisory Board
Table of Contents
Sven Buchholz
Tohru Nitta
Chapter 1
Masaki Kobayashi
Information geometry is one of the most effective tools to investigate stochastic learning models. In it, stochastic learning models are regarded as... Sample PDF
Complex-Valued Boltzmann Manifold
Chapter 2
Takehiko Ogawa
Network inversion solves inverse problems to estimate cause from result using a multilayer neural network. The original network inversion has been... Sample PDF
Complex-Valued Neural Network and Inverse Problems
Chapter 3
Boris Igelnik
This chapter describes the clustering ensemble method and the Kolmogorovs Spline Complex Network, in the context of adaptive dynamic modeling of... Sample PDF
Kolmogorovs Spline Complex Network and Adaptive Dynamic Modeling of Data
Chapter 4
V. Srinivasa Chakravarthy
This chapter describes Complex Hopfield Neural Network (CHNN), a complex-variable version of the Hopfield neural network, which can exist in both... Sample PDF
A Complex-Valued Hopfield Neural Network: Dynamics and Applications
Chapter 5
Mitsuo Yoshida, Takehiro Mori
Global stability analysis for complex-valued artificial recurrent neural networks seems to be one of yet-unchallenged topics in information science.... Sample PDF
Global Stability Analysis for Complex-Valued Recurrent Neural Networks and Its Application to Convex Optimization Problems
Chapter 6
Yasuaki Kuroe
This chapter presents models of fully connected complex-valued neural networks which are complex-valued extension of Hopfield-type neural networks... Sample PDF
Models of Complex-Valued Hopfield-Type Neural Networks and Their Dynamics
Chapter 7
Sheng Chen
The complex-valued radial basis function (RBF) network proposed by Chen et al. (1994) has found many applications for processing complex-valued... Sample PDF
Complex-Valued Symmetric Radial Basis Function Network for Beamforming
Chapter 8
Rajoo Pandey
The equalization of digital communication channel is an important task in high speed data transmission techniques. The multipath channels cause the... Sample PDF
Complex-Valued Neural Networks for Equalization of Communication Channels
Chapter 9
Cheolwoo You, Daesik Hong
In this chapter, the complex Backpropagation (BP) algorithm for the complex backpropagation neural networks (BPN) consisting of the suitable node... Sample PDF
Learning Algorithms for Complex-Valued Neural Networks in Communication Signal Processing and Adaptive Equalization as its Application
Chapter 10
Donq-Liang Lee
New design methods for the complex-valued multistate Hopfield associative memories (CVHAMs) are presented. The author of this chapter shows that the... Sample PDF
Image Reconstruction by the Complex-Valued Neural Networks: Design by Using Generalized Projection Rule
Chapter 11
Naoyuki Morita
The author proposes an automatic estimation method for nuclear magnetic resonance (NMR) spectra of the metabolites in the living body by magnetic... Sample PDF
A Method of Estimation for Magnetic Resonance Spectroscopy Using Complex-Valued Neural Networks
Chapter 12
Michele Scarpiniti, Daniele Vigliano, Raffaele Parisi, Aurelio Uncini
This chapter aims at introducing an Independent Component Analysis (ICA) approach to the separation of linear and nonlinear mixtures in complex... Sample PDF
Flexible Blind Signal Separation in the Complex Domain
Chapter 13
Nobuyuki Matsui, Haruhiko Nishimura, Teijiro Isokawa
Recently, quantum neural networks have been explored as one of the candidates for improving the computational efficiency of neural networks. In this... Sample PDF
Qubit Neural Network: Its Performance and Applications
Chapter 14
Shigeo Sato, Mitsunaga Kinjo
The advantage of quantum mechanical dynamics in information processing has attracted much interest, and dedicated studies on quantum computation... Sample PDF
Neuromorphic Adiabatic Quantum Computation
Chapter 15
G.G. Rigatos, S.G. Tzafestas
Neural computation based on principles of quantum mechanics can provide improved models of memory processes and brain functioning and is of primary... Sample PDF
Attractors and Energy Spectrum of Neural Structures Based on the Model of the Quantum Harmonic Oscillator
Chapter 16
Teijiro Isokawa, Nobuyuki Matsui, Haruhiko Nishimura
Quaternions are a class of hypercomplex number systems, a four-dimensional extension of imaginary numbers, which are extensively used in various... Sample PDF
Quaternionic Neural Networks: Fundamental Properties and Applications
About the Contributors