Flexible Blind Signal Separation in the Complex Domain


Michele Scarpiniti (University of Rome “La Sapienza”, Italy), Daniele Vigliano (University of Rome “La Sapienza”, Italy), Raffaele Parisi (University of Rome “La Sapienza”, Italy) and Aurelio Uncini (University of Rome “La Sapienza”, Italy)
DOI: 10.4018/978-1-60566-214-5.ch012

This chapter introduces an Independent Component Analysis (ICA) approach to the separation of linear and nonlinear mixtures in the complex domain. Source separation is performed by an extension of the INFOMAX approach to the complex environment. The neural network approach is based on an adaptive activation function, whose shape is properly modified during learning. Different models are used to realize complex nonlinear functions in the linear and in the nonlinear environment. In the nonlinear environment the nonlinear functions involved in the learning are implemented by so-called splitting functions, which work separately on the real and the imaginary part of the signal. In the linear environment, instead, the generalized splitting function, which provides a more complete representation of a complex function, is used. Moreover, a simple adaptation algorithm is derived and several experimental results are shown to demonstrate the effectiveness of the proposed method.
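
The following sketch (not taken from the chapter) contrasts the two activation-function models named above: a splitting function applies two independent real nonlinearities to the real and imaginary parts, while a generalized splitting function lets each output part depend on both. The fixed tanh components and the function names are illustrative assumptions; in the chapter the component nonlinearities are adaptive rather than fixed.

    import numpy as np

    def splitting_af(z, f_re=np.tanh, f_im=np.tanh):
        # Splitting model: two separate real-valued nonlinearities act on
        # the real and the imaginary part of the complex signal.
        return f_re(z.real) + 1j * f_im(z.imag)

    def generalized_splitting_af(z, u, v):
        # Generalized splitting model: each output part is a real function
        # of BOTH the real and the imaginary part, which gives a more
        # complete representation of a complex nonlinear function.
        return u(z.real, z.imag) + 1j * v(z.real, z.imag)

For instance, generalized_splitting_af(z, lambda x, y: np.tanh(x + 0.5 * y), lambda x, y: np.tanh(y - 0.5 * x)) couples the two parts, something the plain splitting model cannot express.
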
Chapter Preview


In recent years Blind Source Separation (BSS) realized through Independent Component Analysis (ICA) has raised great interest in the signal processing community (Cichocki & Amari, 2002; Haykin, 2000; Roberts & Everson, 2001). In this context the neural network approach (Haykin, 1999), usually based on a single-layer perceptron (SLP) or a multilayer perceptron (MLP), seems to be one of the preferred methodologies (Jutten & Herault, 1991; Bell & Sejnowski, 1995); this interest is justified by the large number of different approaches and applications. As a matter of fact, in several fields, from multimedia to telecommunications and biomedicine, ICA is currently employed to effectively recover the original sources from their mixtures or to remove interfering signals from the signal of interest. Initial studies on ICA aimed at solving the well-known cocktail party problem in an instantaneous or slightly reverberant environment. Pioneering works on ICA appeared in the early 1990s, when Jutten and Herault (1991) presented their “neuromimetic architecture” and Comon (1994) published his often-cited work.

Recently the problem of source separation has been extended to the complex domain (Cardoso & Laheld, 1996; Fiori, Uncini & Piazza, 1999; Bingham & Hyvärinen, 2000), owing to the need for frequency-domain signal processing, which is quite common in telecommunications (Benvenuto, Marchesi, Piazza & Uncini, 1991) and biomedical applications (Calhoun, Adali, Pearlson & Pekar, 2002b; Calhoun, Adali, Pearlson, Van Zijl & Pekar, 2002c). One of the most critical issues in ICA is the matching between the probability density function (pdf) of the sources (usually unknown) and the algorithm’s parameters (Yang & Amari, 1997). Accordingly, one of the most important issues in designing complex neural networks is the definition of the complex activation function (Clarke, 1990; Benvenuto & Piazza, 1992; Kim & Adali, 2001a). In order to improve the pdf matching of the learning algorithm, the so-called flexible ICA was recently introduced (Choi, Cichocki & Amari, 2000; Fiori, 2000; Solazzi, Piazza & Uncini, 2000a; Vigliano & Uncini, 2003; Vigliano, Parisi & Uncini, 2005). In flexible ICA the activation function (AF) of the neural network is adaptively modified during learning. This approach provides faster and more accurate learning by estimating the parameters related to the pdf of the signals. In the literature several methods can be found, based on polynomials (Amari, Cichocki & Yang, 1996) and on parametric function approaches (Pham, Garat & Jutten, 1992; Solazzi, Piazza & Uncini, 2001).
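
A minimal sketch of the flexible idea, under simplifying assumptions, is the following: the activation function is stored as a set of adaptable control points, so that its shape (and hence the implied source pdf model) can change during learning. The class name, the grid parameters and the piecewise-linear interpolation are placeholders; spline-based parameterizations such as the one in Solazzi, Piazza & Uncini (2001) use smooth interpolation and include the control points in the learning rule.

    import numpy as np

    class FlexibleActivation:
        # Adaptable activation function: its shape is given by control
        # points q over a fixed grid x; in flexible ICA the entries of q
        # are updated during learning (the update rule is omitted here).
        def __init__(self, x_min=-2.0, x_max=2.0, n_points=21):
            self.x = np.linspace(x_min, x_max, n_points)
            self.q = np.tanh(self.x)   # start from a tanh-like shape

        def __call__(self, s):
            # Piecewise-linear interpolation between the control points;
            # values outside the grid are clamped, so the output is bounded.
            return np.interp(s, self.x, self.q)

In the complex case each component nonlinearity of a splitting (or generalized splitting) model can be made flexible in this way.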

Moreover, the main properties that a complex activation function should satisfy (Kim & Adali, 2002a; Vigliano, Parisi & Uncini, 2003) are that it be nonlinear and bounded and that its partial derivatives exist and be bounded. Unfortunately, analyticity and boundedness are in conflict because of Liouville’s theorem (Clarke, 1990; Kim & Adali, 2001a), which states that a function that is analytic and bounded on the whole complex plane must be constant. In other words, according to this theorem, an activation function can only be required to be analytic and bounded almost everywhere in the complex domain (Clarke, 1990; Leung & Haykin, 1991; Georgiou & Koutsougeras, 1992; Kim & Adali, 2000, 2001a, 2002b; Adali, Kim & Calhoun, 2004).
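
As a toy numerical illustration of this conflict (not part of the chapter), the analytic extension of the familiar real tanh nonlinearity is unbounded in the complex plane, since it has singularities at z = j(π/2 + kπ), whereas a splitting-type construction stays bounded by giving up analyticity:

    import numpy as np

    # |tanh(z)| grows without bound as z approaches the singularity at j*pi/2 ...
    for eps in (1e-1, 1e-3, 1e-6):
        z = 1j * (np.pi / 2 - eps)
        print(f"|tanh(j(pi/2 - {eps:g}))| = {abs(np.tanh(z)):.3e}")

    # ... while a splitting-type function such as tanh(Re z) + j tanh(Im z)
    # remains bounded everywhere, at the price of not being analytic.
    z = 1j * np.pi / 2
    print(abs(np.tanh(z.real) + 1j * np.tanh(z.imag)))   # about 0.917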
