Global Stability Analysis for Complex-Valued Recurrent Neural Networks and Its Application to Convex Optimization Problems


Mitsuo Yoshida (Kyoto Institute of Technology, Japan) and Takehiro Mori (Kyoto Institute of Technology, Japan)
DOI: 10.4018/978-1-60566-214-5.ch005

Abstract

Global stability analysis of complex-valued artificial recurrent neural networks remains one of the yet-unchallenged topics in information science. This chapter presents global stability conditions for discrete-time and continuous-time complex-valued recurrent neural networks, which are regarded as nonlinear dynamical systems. Global asymptotic stability conditions for these networks are derived through suitable choices of activation functions. Under these conditions, there are classes of discrete-time and continuous-time complex-valued recurrent neural networks whose equilibrium point is globally asymptotically stable. Furthermore, the conditions are shown to apply successfully to convex programming problems, for which solution methods over the real field are generally tedious.
Chapter Preview

Introduction

Recurrent neural networks whose neurons are fully interconnected have been used to implement associative memories and to solve optimization problems. These networks are regarded as nonlinear dynamical feedback systems, and the stability properties of this class of networks are an important issue from an applications point of view.

On the other hand, several models of neural networks that can deal with complex numbers, the complex-valued neural networks, have come to the fore in recent years. In these networks, the states, connection weights, and activation functions are all complex-valued. Such networks have been studied in terms of their information-processing abilities, because they possess attractive features that do not exist in their real-valued counterparts (Hirose, 2003; Kuroe, Hashimoto & Mori, 2001, 2002; Kuroe, Yoshida & Mori, 2003; Nitta, 2000; Takeda & Kishigami, 1992; Yoshida, Mori & Kuroe, 2004; Yoshida & Mori, 2007). Generally, the activation functions of a neural network crucially determine its dynamic behavior. In complex-valued neural networks, there is a greater choice of activation functions than in real-valued networks; however, the question of appropriate activation functions has received insufficient attention in the past.
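To make the wider choice of activation functions concrete, the following minimal sketch (not taken from the chapter; the weight and bias values are arbitrary assumptions) updates a single complex-valued neuron with one commonly studied activation that squashes the modulus while preserving the phase, f(z) = tanh(|z|) · z/|z|:

```python
import numpy as np

def activation(z: complex) -> complex:
    """Illustrative complex activation: bounded modulus, phase preserved."""
    r = abs(z)
    if r == 0.0:
        return 0.0 + 0.0j
    return np.tanh(r) * z / r  # |f(z)| = tanh(|z|) < 1, arg f(z) = arg z

w = 0.8 - 0.3j   # complex connection weight (assumed value)
b = 0.1 + 0.05j  # complex bias (assumed value)
z = 1.0 + 1.0j   # complex neuron state

z_next = activation(w * z + b)  # one state update through the weight and bias
print(z_next)
```

Many other choices exist (e.g., applying real sigmoids separately to the real and imaginary parts), which is precisely why the selection of activation classes matters for stability.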

Local asymptotic stability conditions for complex-valued recurrent neural networks with an energy function defined on the complex domain have been studied earlier, and synthesis of complex-valued associative memories has been realized (Kuroe et al., 2001, 2002). However, their application to global optimization problems and the theoretical analysis of global asymptotic stability conditions remain yet-unchallenged topics.

The purpose of this chapter is to analyze global asymptotic stability for complex-valued recurrent neural networks. Two types of complex-valued recurrent neural networks are considered: a discrete-time model and a continuous-time model. We present global asymptotic stability conditions for both models. To ensure global stability, classes of complex-valued functions are defined as the activation functions, and several stability conditions are thereby obtained. Under these conditions, there are classes of discrete-time and continuous-time complex-valued recurrent neural networks whose common equilibrium point is globally asymptotically stable. Furthermore, the obtained conditions are shown to be successfully applicable to solving convex programming problems.
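The two model types can be sketched as follows. This is a hedged illustration in a generic Hopfield-type form, not the chapter's exact equations: the weights `W`, biases `b`, and the modulus-squashing activation are assumptions chosen small enough that the iteration contracts, so both models settle to the same equilibrium:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
# Small random complex weights and biases (assumed values, not from the chapter)
W = 0.1 * (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n)))
b = rng.normal(size=n) + 1j * rng.normal(size=n)

def f(z):
    """One admissible activation class: tanh(|z|) e^{i arg z}, applied elementwise."""
    r = np.abs(z)
    safe = np.where(r > 0, r, 1.0)  # avoid 0/0 at the origin
    return np.tanh(r) * z / safe

# Discrete-time model: x(t+1) = f(W x(t) + b)
x = np.zeros(n, dtype=complex)
for _ in range(500):
    x = f(W @ x + b)

# Continuous-time model, forward-Euler steps of dx/dt = -x + f(W x + b)
y = np.zeros(n, dtype=complex)
dt = 0.05
for _ in range(4000):
    y = y + dt * (-y + f(W @ y + b))

print(np.max(np.abs(x - f(W @ x + b))))  # fixed-point residual, small at equilibrium
print(np.max(np.abs(x - y)))             # both models reach the same equilibrium
```

Both dynamics share the fixed-point equation x = f(Wx + b); the chapter's contribution is conditions under which such an equilibrium is globally asymptotically stable, rather than merely locally attracting.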

The chapter is organized as follows. In Background, a brief summary of applications of real-valued recurrent neural networks to associative memories and optimization problems is presented, and results on the stability analysis and applications of these real-valued networks are introduced. Next, models of discrete-time and continuous-time complex-valued neural networks are described, and two classes of complex-valued functions are defined as their activation functions. In the next section, global asymptotic stability conditions for the discrete-time and continuous-time complex-valued neural networks are proved, together with some discussion thereof. Furthermore, applications of complex-valued neural networks to convex programming problems are shown with numerical examples in the subsequent section. Finally, concluding remarks and future research directions are given.
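The link between global stability and convex programming rests on a general principle: if a dynamical system is globally asymptotically stable and its unique equilibrium coincides with the optimum of a convex problem, then simply running the dynamics solves the problem. The chapter develops this in the complex domain; the following real-valued sketch (a generic gradient flow, not the chapter's network construction) shows the principle on a strictly convex quadratic f(x) = ½xᵀQx − cᵀx, whose minimizer solves Qx = c:

```python
import numpy as np

Q = np.array([[3.0, 1.0],
              [1.0, 2.0]])   # symmetric positive definite, so f is strictly convex
c = np.array([1.0, 1.0])

x = np.array([5.0, -5.0])    # arbitrary initial state
dt = 0.05
for _ in range(2000):
    x = x - dt * (Q @ x - c)  # forward-Euler step of dx/dt = -grad f(x)

print(x)                      # state driven to the unique minimizer
print(np.linalg.solve(Q, c))  # closed-form optimum Q^{-1} c, for comparison
```

Because the equilibrium is globally asymptotically stable, convergence holds from any initial state, which is what makes such dynamics usable as an optimization method.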

Complete Chapter List

Editorial Advisory Board
Table of Contents
Foreword (Sven Buchholz)
Acknowledgment (Tohru Nitta)
Chapter 1. Complex-Valued Boltzmann Manifold (Masaki Kobayashi)
Chapter 2. Complex-Valued Neural Network and Inverse Problems (Takehiko Ogawa)
Chapter 3. Kolmogorov's Spline Complex Network and Adaptive Dynamic Modeling of Data (Boris Igelnik)
Chapter 4. A Complex-Valued Hopfield Neural Network: Dynamics and Applications (V. Srinivasa Chakravarthy)
Chapter 5. Global Stability Analysis for Complex-Valued Recurrent Neural Networks and Its Application to Convex Optimization Problems (Mitsuo Yoshida, Takehiro Mori)
Chapter 6. Models of Complex-Valued Hopfield-Type Neural Networks and Their Dynamics (Yasuaki Kuroe)
Chapter 7. Complex-Valued Symmetric Radial Basis Function Network for Beamforming (Sheng Chen)
Chapter 8. Complex-Valued Neural Networks for Equalization of Communication Channels (Rajoo Pandey)
Chapter 9. Learning Algorithms for Complex-Valued Neural Networks in Communication Signal Processing and Adaptive Equalization as its Application (Cheolwoo You, Daesik Hong)
Chapter 10. Image Reconstruction by the Complex-Valued Neural Networks: Design by Using Generalized Projection Rule (Donq-Liang Lee)
Chapter 11. A Method of Estimation for Magnetic Resonance Spectroscopy Using Complex-Valued Neural Networks (Naoyuki Morita)
Chapter 12. Flexible Blind Signal Separation in the Complex Domain (Michele Scarpiniti, Daniele Vigliano, Raffaele Parisi, Aurelio Uncini)
Chapter 13. Qubit Neural Network: Its Performance and Applications (Nobuyuki Matsui, Haruhiko Nishimura, Teijiro Isokawa)
Chapter 14. Neuromorphic Adiabatic Quantum Computation (Shigeo Sato, Mitsunaga Kinjo)
Chapter 15. Attractors and Energy Spectrum of Neural Structures Based on the Model of the Quantum Harmonic Oscillator (G.G. Rigatos, S.G. Tzafestas)
Chapter 16. Quaternionic Neural Networks: Fundamental Properties and Applications (Teijiro Isokawa, Nobuyuki Matsui, Haruhiko Nishimura)
About the Contributors