Complex-Valued Boltzmann Manifold

Masaki Kobayashi (University of Yamanashi, Japan)
DOI: 10.4018/978-1-60566-214-5.ch001

Abstract

Information geometry is one of the most effective tools for investigating stochastic learning models. In this framework, stochastic learning models are regarded as manifolds in the sense of differential geometry. Amari applied it to Boltzmann Machines, which are one of the stochastic learning models. The purpose of this chapter is to apply information geometry to complex-valued Boltzmann Machines. First, the author constructs the complex-valued Boltzmann Machines. Next, the author describes information geometry and introduces its important notions: exponential families, mixture families, the Kullback-Leibler divergence, connections, geodesics, the Fisher metric, potential functions and so on. Finally, the author applies information geometry to complex-valued Boltzmann Machines, investigates the structure of the complex-valued Boltzmann manifold, and identifies its connections and Fisher metric. Moreover, an effective learning algorithm, the so-called em algorithm, is obtained for complex-valued Boltzmann Machines with hidden neurons.

Introduction

These days we can obtain massive amounts of information, and it is hard to deal with them without computers. Machine learning enables computers to manage such massive information. Machine learning uses various learning machine models, for instance, decision trees, Bayesian Networks, Support Vector Machines, Hidden Markov Models, normal mixture distributions, neural networks and so on. Some of them are constructed stochastically.

The neural network is one of the learning machine models. It consists of many units, which are called neurons. Binary neurons are often used; each neuron then takes only two states, although the set of neurons as a whole can take many states. Various types of neural networks have been proposed, and feed-forward and symmetric neural networks are the main models. Feed-forward neural networks are often applied to recognize given patterns and are very useful. Symmetric neural networks are often applied as Associative Memories. The Hopfield Network is one of the most famous models, and Boltzmann Machines are stochastic versions of Hopfield Networks.
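To make the relation between the two symmetric models concrete, the following minimal Python sketch contrasts the deterministic Hopfield update with a stochastic Boltzmann-style update. The bipolar states in {-1, +1}, the symmetric zero-diagonal weight matrix W, and the temperature T are illustrative assumptions, not the chapter's notation.

```python
import numpy as np

rng = np.random.default_rng(0)

def hopfield_energy(x, W):
    """Energy of a symmetric network with bipolar states x in {-1, +1}.
    W is assumed symmetric with zero diagonal."""
    return -0.5 * x @ W @ x

def hopfield_update(x, W):
    """Deterministic asynchronous update: each neuron aligns with its local field."""
    x = x.copy()
    for i in rng.permutation(len(x)):
        x[i] = 1 if W[i] @ x >= 0 else -1
    return x

def boltzmann_update(x, W, T=1.0):
    """Stochastic update (Boltzmann Machine): neuron i takes state +1 with a
    sigmoid probability of its local field, scaled by the temperature T."""
    x = x.copy()
    for i in rng.permutation(len(x)):
        h = W[i] @ x
        p_plus = 1.0 / (1.0 + np.exp(-2.0 * h / T))
        x[i] = 1 if rng.random() < p_plus else -1
    return x
```

At high temperature the stochastic update behaves almost randomly, and as T approaches zero it reduces to the deterministic Hopfield update.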

Neurons take only two states in most cases. McEliece, Posner, Rodemich and Venkatesh (1987) is a highly recognized critique of the low storage capacity of the Hopfield memory. Baldi and Hornik (1989) rigorously showed the existence of numerous local minima in the objective function for learning a nonlinear perceptron. The representation capacity of a single binary neuron is poor, so neuron models with multiple states are desirable. Some researchers have proposed such models. The multi-level neuron is one of them (Zurada, Cloete & Poel, 1996). Complex-valued neurons are also multi-state neuron models. Several models of complex-valued neurons have been proposed, for example, phasor neurons (Noest, 1988a), discrete-state phasor neurons (Noest, 1988b), the amplitude-phase type of complex-valued neurons (Hirose, 1992; Kuroe, 2003), the real part – imaginary part type of complex-valued neurons (Benvenuto & Piazza, 1992; Nitta & Furuya, 1991; Nitta, 1997) and others (Nemoto & Kubono, 1996; Nemoto, 2003). This chapter deals with phasor neurons and discrete phasor neurons; the former are called continuous phasor neurons here to distinguish them clearly from discrete phasor neurons.
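For concreteness, the following Python sketch shows the two state sets considered in this chapter: a continuous phasor neuron may take any value on the unit circle, while a discrete phasor neuron with resolution K is restricted to the K-th roots of unity. The function names are illustrative, and K = 2 recovers the ordinary binary neuron.

```python
import numpy as np

def continuous_phasor_state(theta):
    """A continuous phasor neuron takes an arbitrary state on the unit circle."""
    return np.exp(1j * theta)

def discrete_phasor_states(K):
    """A discrete phasor neuron with resolution K is restricted to the K-th roots of unity."""
    return np.exp(2j * np.pi * np.arange(K) / K)

print(discrete_phasor_states(2))  # +1 and -1: the usual binary neuron (up to rounding)
print(discrete_phasor_states(4))  # 1, i, -1, -i (up to rounding)
```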

Several types of neural networks, including feed-forward neural networks, Hopfield Networks and Boltzmann Machines, have been extended to complex-valued neural networks. Noest proposed complex-valued Hopfield Networks (Noest, 1988a; Noest, 1988b), which are called continuous phasor and discrete phasor neural networks. Hirose (1992) proposed back-propagation learning algorithms for the amplitude-phase type of complex-valued neural networks. Benvenuto and Piazza (1992) and Nitta and Furuya (1991) independently proposed back-propagation learning algorithms for the real part – imaginary part type of complex-valued neural networks. Boltzmann Machines were also extended to complex-valued Boltzmann Machines (Zemel, Williams & Mozer, 1993; Zemel, Williams & Mozer, 1995). The complex-valued Boltzmann Machines proposed by Zemel et al. (1993) are continuous models. Kobayashi and Yamazaki (2003) proposed the discrete version of complex-valued Boltzmann Machines.
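As a rough illustration of how such a discrete complex-valued model can be made stochastic, the sketch below combines a Hopfield-type energy over complex states, E = -Re(x^H W x)/2 with a Hermitian weight matrix W, with a Gibbs-style resampling of each discrete phasor neuron over the K-th roots of unity. This is only a minimal sketch under these assumptions; it is not the exact formulation of Zemel et al. (1993) or Kobayashi and Yamazaki (2003).

```python
import numpy as np

rng = np.random.default_rng(1)

def complex_energy(x, W):
    """Hopfield-type energy for complex states x and a Hermitian weight matrix W:
    E = -0.5 * Re(x^H W x)."""
    return -0.5 * np.real(np.conj(x) @ W @ x)

def gibbs_sweep(x, W, K=4, T=1.0):
    """One Gibbs sweep: resample every discrete phasor neuron among the K-th
    roots of unity with probabilities proportional to exp(-E / T)."""
    states = np.exp(2j * np.pi * np.arange(K) / K)
    x = x.copy()
    for i in rng.permutation(len(x)):
        energies = np.array([complex_energy(np.concatenate([x[:i], [s], x[i+1:]]), W)
                             for s in states])
        p = np.exp(-(energies - energies.min()) / T)
        p /= p.sum()
        x[i] = rng.choice(states, p=p)
    return x
```

With K = 2 and a real weight matrix, this reduces (up to floating-point rounding) to the ordinary real-valued Boltzmann Machine update.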
