Image Reconstruction by the Complex-Valued Neural Networks: Design by Using Generalized Projection Rule

Donq-Liang Lee (Ming-Chuan University, Taiwan)
DOI: 10.4018/978-1-60566-214-5.ch010

New design methods for complex-valued multistate Hopfield associative memories (CVHAMs) are presented. The author of this chapter shows that the well-known projection rule can be generalized to the complex domain, so that the weight matrix of the CVHAM can be designed by using the generalized inverse technique. The stability of the presented CVHAM is analyzed by an energy-function approach, which shows that in synchronous update mode a CVHAM is guaranteed to converge to a fixed point from any given initial state. Moreover, the projection geometry of the generalized projection rule is discussed. In order to enhance the recall capability, a strategy for eliminating spurious memories is reported. Next, a generalized intraconnected bidirectional associative memory (GIBAM) is introduced. A GIBAM is a complex generalization of the intraconnected BAM (IBAM). The author shows that the design of the GIBAM can also be accomplished by using the generalized inverse technique. Finally, the validity and the performance of the introduced methods are investigated by computer simulation.
Chapter Preview


Storing gray-scale images with neural networks is a challenging problem and has received much attention in the past two decades. There are four main approaches for storing images with n pixels and K gray levels. The first approach is to encode the gray level of each pixel by R (= ⌈log₂ K⌉) binary neurons (Taketa & Goodman, 1986; Cernuschi-Frias, 1989; Lee, 1999). However, this method needs a large number of neurons (nR) and interconnection weights ((nR)²). The second approach is based on neural networks with multivalued stable states (Si & Michel, 1991; Zurada, Cloete, & van der Poel, 1996). The activation function is a quantized nonlinearity with K plateaus corresponding to the K gray levels. The required number of neurons is n and the number of interconnections is n². The third approach is to decompose the K gray-level image into R gray-coded images (Costantini, Casali, & Perfetti, 2003). These images are then stored by using R independent binary neural networks. The required number of interconnections is Rn². The fourth approach is based on complex-valued neural networks (Jankowski, Lozowski, & Zurada, 1996; Lee, 2001a, 2001b, 2003). The neuron state can assume one of K complex values, equally spaced on the unit circle. Each phase angle corresponds to a gray level. The number of neurons is n; the number of interconnections is n².
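To make the fourth approach concrete, the following minimal sketch (using NumPy; the helper names gray_to_phase and phase_to_gray are illustrative, not taken from the chapter) maps K gray levels onto K points equally spaced on the unit circle and quantizes them back.

```python
import numpy as np

def gray_to_phase(pixels, K):
    """Map gray levels 0..K-1 to K points equally spaced on the unit circle.

    Each pixel value g becomes exp(j * 2*pi*g / K); the phase angle encodes
    the gray level, so an n-pixel image needs only n complex-valued neurons.
    """
    pixels = np.asarray(pixels)
    return np.exp(1j * 2 * np.pi * pixels / K)

def phase_to_gray(states, K):
    """Invert the encoding: quantize each phase back to the nearest of K levels."""
    phases = np.mod(np.angle(states), 2 * np.pi)          # phases in [0, 2*pi)
    return np.rint(phases * K / (2 * np.pi)).astype(int) % K

# Example: a 4-pixel image with K = 8 gray levels
img = [0, 3, 5, 7]
z = gray_to_phase(img, K=8)
print(phase_to_gray(z, K=8))   # -> [0 3 5 7]
```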

The objective of this chapter is to review and discuss some recent developments in complex-valued neural networks (CVNNs). Here we present two types of CVNNs: auto-associative networks and hetero-associative networks. Network structures, evolution equations, stability, and design methods are discussed in detail. For simplicity, the CVNNs considered here have no threshold (bias) vectors, although including them would add a substantial degree of freedom to the design.

The complex-valued Hopfield associative memory (CVHAM) proposed by Jankowski, Lozowski, and Zurada (1996) is a kind of auto-associative network. It can be regarded as a modified Hopfield network (Hopfield, 1984) with complex-signum activation functions and complex-valued connection weights. The learning algorithms of conventional CVHAMs include the generalized Hebb rule (Jankowski, Lozowski, & Zurada, 1996), the gradient descent learning rule (Lee, 2001a, 2003), the energy design method (Müezzinoğlu, Güzeliş, & Zurada, 2003), and others. However, the recall capability of these CVHAMs is limited because such methods do not seriously take the attraction of the fixed points into account. In this chapter, the generalized projection rule for the CVHAM is introduced.
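The chapter's exact formulation is not reproduced here, but a minimal sketch of the underlying idea, computing the weight matrix from the stored prototype vectors with the Moore-Penrose pseudoinverse and iterating a complex-signum update, might look as follows (NumPy assumed; the function names and the synchronous update schedule are illustrative).

```python
import numpy as np

def csign(v, K):
    """Complex-signum activation: quantize each component's phase to one of K
    equally spaced values on the unit circle (resolution factor K)."""
    phases = np.mod(np.angle(v), 2 * np.pi)
    sectors = np.floor(phases * K / (2 * np.pi) + 0.5) % K   # nearest sector
    return np.exp(1j * 2 * np.pi * sectors / K)

def projection_weights(S):
    """Projection (generalized-inverse) rule: W = S S^+, where the columns of
    S are the complex prototype vectors.  W projects any state onto the
    subspace spanned by the prototypes, so each stored pattern is a fixed
    point of x -> csign(W x)."""
    return S @ np.linalg.pinv(S)

def recall(W, x0, K, max_iter=50):
    """Synchronous recall: iterate x <- csign(W x) until a fixed point is reached."""
    x = x0
    for _ in range(max_iter):
        x_new = csign(W @ x, K)
        if np.allclose(x_new, x):
            break
        x = x_new
    return x
```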

In 1991, Jeng and Yeh introduced a modified intraconnected bidirectional associative memory (MIBAM). It is a two-layer hetero-associative memory in which the intralayer feedback processes run in parallel with the interlayer processes, instead of sequentially as in the IBAM (Simpson, 1990). Compared with the IBAM, the MIBAM yields both improved storage capacity and error-correcting capability. However, the improvements are minor because the design of the MIBAM does not seriously consider the safe storage of all training pairs. With the help of the generalized inverse technique, a generalized model of the IBAM (GIBAM) is proposed in this chapter. Computer simulation demonstrates that the GIBAM has better recall performance than the MIBAM.
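The full GIBAM also contains intralayer (intraconnected) weights and a specific update schedule that are not reproduced here; the sketch below only illustrates how the generalized inverse technique can design the interlayer weights of a two-layer complex-valued bidirectional memory (NumPy assumed; all names are illustrative).

```python
import numpy as np

def csign(v, K):
    """Quantize phases to K points on the unit circle (as in the CVHAM sketch)."""
    sectors = np.floor(np.mod(np.angle(v), 2 * np.pi) * K / (2 * np.pi) + 0.5) % K
    return np.exp(1j * 2 * np.pi * sectors / K)

def bam_weights(X, Y):
    """Generalized-inverse design of the interlayer weights: with the training
    pairs stored as columns of X and Y, W_xy X = Y and W_yx Y = X hold exactly
    whenever the stored columns are linearly independent."""
    return Y @ np.linalg.pinv(X), X @ np.linalg.pinv(Y)

def bam_recall(W_xy, W_yx, x0, K, iters=20):
    """Bidirectional recall: alternate between the two layers until stable."""
    x = x0
    for _ in range(iters):
        y = csign(W_xy @ x, K)
        x = csign(W_yx @ y, K)
    return x, y
```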
