Nonstationary Signal Analysis with Kernel Machines

Paul Honeine (Institut Charles Delaunay, France), Cédric Richard (Institut Charles Delaunay, France) and Patrick Flandrin (Ecole Normale Supérieure de Lyon, France)
DOI: 10.4018/978-1-60566-766-9.ch010

Abstract

This chapter introduces machine learning for nonstationary signal analysis and classification. It argues that machine learning based on the theory of reproducing kernels can be extended to nonstationary signal analysis and classification. The authors show that some specific reproducing kernels allow pattern recognition algorithms to operate in the time-frequency domain. Furthermore, the authors study the selection of the reproducing kernel for a nonstationary signal classification problem. For this purpose, the kernel-target alignment is investigated as a selection criterion, yielding the optimal time-frequency representation for a given classification problem. These links offer new perspectives in the field of nonstationary signal analysis, which can benefit from recent developments in statistical learning theory and pattern recognition.
Chapter Preview

RKHS and Kernel Machines: A Brief Review

The theory of reproducing kernel Hilbert spaces (RKHS) serves as the foundation of kernel machines. The main building blocks of these statistical learning algorithms are the kernel trick and the Representer Theorem. In this section, these concepts are presented succinctly, after a short introduction to reproducing kernels.
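
As a brief, hedged illustration of these two building blocks, the following NumPy sketch (not code from the chapter; the Gaussian kernel and toy data are illustrative choices) evaluates feature-space inner products through a kernel alone, and writes a learned function as the kernel expansion guaranteed by the Representer Theorem:

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # kappa(x, y) = exp(-||x - y||^2 / (2 sigma^2)): the inner product
    # <phi(x), phi(y)> in an implicit feature space, never computed explicitly.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

# Gram matrix on a toy training set: all a kernel machine needs from the data.
X = np.array([[0.0], [1.0], [2.0]])
K = np.array([[gaussian_kernel(xi, xj) for xj in X] for xi in X])

# Representer Theorem: the solution is a kernel expansion over the samples,
# f(x) = sum_i alpha_i kappa(x_i, x); the alphas here are arbitrary.
alpha = np.array([1.0, -0.5, 0.2])
f = lambda x: sum(a * gaussian_kernel(xi, x) for a, xi in zip(alpha, X))
```

The feature map itself never appears: every computation goes through the Gram matrix K or kernel evaluations.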

Key Terms in this Chapter

Kernel-Target Alignment Criterion: A criterion to select and tune a kernel for a given learning task, prior to any learning, by comparing it to an ideal kernel obtained from the available training data.
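
A minimal sketch of the criterion, assuming the usual Frobenius-normalized definition A(K, yyᵀ) = ⟨K, yyᵀ⟩ / (‖K‖ ‖yyᵀ‖) with labels in {−1, +1} (toy data, not the chapter's code):

```python
import numpy as np

def alignment(K, y):
    # Kernel-target alignment: Frobenius inner product between the Gram
    # matrix K and the ideal kernel y y^T, normalized so that A is in [-1, 1].
    # Note <K, y y^T>_F = y^T K y and ||y y^T||_F = y^T y.
    return (y @ K @ y) / (np.linalg.norm(K) * (y @ y))

# Toy labels; the ideal kernel itself aligns perfectly with the target.
y = np.array([1.0, 1.0, -1.0, -1.0])
K_ideal = np.outer(y, y)
```

A Gram matrix that ignores the labels, such as the identity, scores strictly lower; maximizing this score over a kernel family is what selects the representation.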

Positive Definite Kernel: A two-variable function κ defined on X × X that satisfies Σi Σj ai āj κ(xi, xj) ≥ 0 for all x1, …, xn ∈ X and a1, …, an ∈ C.
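
This condition can be checked empirically on any Gram matrix: a positive definite kernel always yields a positive semidefinite Gram matrix, so its eigenvalues are nonnegative. A sketch with a Gaussian kernel on random points (illustrative choices throughout, not a proof):

```python
import numpy as np

# Random sample of 20 points in R^3.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))

# Gaussian Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / 2).
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-sq_dists / 2.0)

# Positive semidefiniteness: all eigenvalues nonnegative (up to round-off).
eigvals = np.linalg.eigvalsh(K)
```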

Wavelet Representation: A linear time-frequency representation relying on a time-translation and a scaling of a mother wavelet w, and defined by Tx(t, a) = a^(−1/2) ∫ x(τ) w*((τ − t)/a) dτ.
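
A discretized sketch of this representation, using a complex Morlet-like mother wavelet as an illustrative choice (not the chapter's implementation):

```python
import numpy as np

def cwt(x, scales, fs=1.0):
    # Discrete approximation of T_x(t, a) = a^(-1/2) * integral of
    # x(tau) * conj(w((tau - t) / a)) dtau, i.e. a correlation of the
    # signal with scaled, translated copies of the mother wavelet.
    n = len(x)
    tau = (np.arange(n) - n // 2) / fs
    out = np.empty((len(scales), n), dtype=complex)
    for i, a in enumerate(scales):
        # Morlet-like wavelet: complex oscillation under a Gaussian envelope;
        # its center frequency is 5/a in cycles per sample (illustrative).
        w = np.exp(2j * np.pi * 5.0 * tau / a) * np.exp(-((tau / a) ** 2) / 2)
        out[i] = np.convolve(x, np.conj(w[::-1]), mode="same") / np.sqrt(a)
    return out
```

For a sinusoid at frequency 0.1, the scalogram energy concentrates at the scale whose wavelet center frequency matches, here a = 50.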

Reproducing Kernel Hilbert Space: A Hilbert space of functions from X to C that possesses a reproducing kernel, i.e. a (positive definite) kernel κ(xi, xj) with the properties: (1) κ(·, x) belongs to that space, and (2) ⟨ψ, κ(·, x)⟩ = ψ(x), for all x ∈ X and every function ψ of that space.

Wigner-Ville Distribution: A high-resolution joint time-frequency distribution for nonstationary signal analysis, defined by Wx(t, f) = ∫ x(t + τ/2) x*(t − τ/2) e^(−2jπfτ) dτ for a given signal x.
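
A common textbook discretization of this distribution, built from the instantaneous autocorrelation x(t + τ) x*(t − τ) followed by an FFT over τ (a sketch, not the chapter's code):

```python
import numpy as np

def wigner_ville(x):
    # Discrete Wigner-Ville distribution, returned as a real-valued
    # (time x frequency) array; row t is the FFT over the lag variable
    # tau of the instantaneous autocorrelation x[t + tau] * conj(x[t - tau]).
    x = np.asarray(x, dtype=complex)
    n = len(x)
    W = np.zeros((n, n))
    for t in range(n):
        m = min(t, n - 1 - t)           # largest lag staying in bounds
        tau = np.arange(-m, m + 1)
        acf = np.zeros(n, dtype=complex)
        acf[tau % n] = x[t + tau] * np.conj(x[t - tau])
        W[t] = np.fft.fft(acf).real     # Hermitian acf => real spectrum
    return W
```

For a complex exponential at normalized frequency 0.125, each row peaks at the matching frequency bin, illustrating the distribution's sharp localization.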

Cohen’s Class of Time-Frequency Distributions: The class of distributions covariant with respect to time and frequency shifts applied to the studied signal. For a given signal x, a distribution belonging to Cohen’s class is given by Cx(t, f) = ∫∫ Π(t − t′, f − f′) Wx(t′, f′) dt′ df′, where Wx is the Wigner-Ville distribution of x and Π is a tunable function.
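
Under this definition, a member of Cohen's class can be sketched as a 2D smoothing of the Wigner-Ville distribution by the tunable kernel Π; the wrapped Gaussian Π below is an illustrative choice, and a compact Wigner-Ville routine is repeated so the sketch is self-contained:

```python
import numpy as np

def wvd(x):
    # Compact discrete Wigner-Ville distribution (see the Wigner-Ville
    # entry for the continuous definition).
    x = np.asarray(x, dtype=complex)
    n = len(x)
    W = np.zeros((n, n))
    for t in range(n):
        m = min(t, n - 1 - t)
        tau = np.arange(-m, m + 1)
        acf = np.zeros(n, dtype=complex)
        acf[tau % n] = x[t + tau] * np.conj(x[t - tau])
        W[t] = np.fft.fft(acf).real
    return W

def cohen_distribution(x, Pi):
    # Member of Cohen's class: 2D (circular) convolution of the
    # Wigner-Ville distribution with the tunable kernel Pi, done by FFT.
    W = wvd(x)
    return np.real(np.fft.ifft2(np.fft.fft2(W) * np.fft.fft2(Pi)))

# Illustrative smoothing kernel: separable wrapped Gaussian with unit mass,
# so total energy of the distribution is preserved.
n = 64
g = np.exp(-0.5 * (((((np.arange(n) + n // 2) % n) - n // 2)) / 2.0) ** 2)
g /= g.sum()
Pi = np.outer(g, g)
```

The smoothing trades the Wigner-Ville distribution's resolution for attenuation of its interference terms, which is the usual motivation for leaving Π tunable.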

Short-Time Fourier Transform: A linear time-frequency representation of a signal, defined by Fx(t, f) = ∫ x(τ) w*(τ − t) e^(−2jπfτ) dτ for a given analysis window w.
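
A windowed-FFT sketch of this transform (the hop size and Hann window are illustrative choices; the per-frame phase convention differs from the integral definition by a factor that leaves the magnitude, hence the spectrogram, unchanged):

```python
import numpy as np

def stft(x, w, hop=1):
    # Short-time Fourier transform: slide the conjugated analysis window w
    # along x and take an FFT of each windowed frame.
    n, m = len(x), len(w)
    frames = [x[t:t + m] * np.conj(w) for t in range(0, n - m + 1, hop)]
    return np.fft.fft(np.asarray(frames), axis=1)
```

For a sinusoid at normalized frequency 0.25 analyzed with a length-32 window, every frame's spectrum peaks at bin 8.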

Kernel-PCA: A nonlinear extension of the classical Principal Component Analysis algorithm based on the kernel paradigm, yielding a powerful feature extraction technique.
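
A compact sketch of the algorithm, assuming the standard double-centering of the Gram matrix (not the chapter's code); with a linear kernel it reduces to classical PCA:

```python
import numpy as np

def kernel_pca(K, n_components=2):
    # Kernel-PCA: double-center the Gram matrix (centering in feature
    # space), eigendecompose it, and return the projections of the
    # training points onto the leading principal axes in feature space.
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]    # largest eigenvalues first
    vals, vecs = vals[idx], vecs[:, idx]
    return vecs * np.sqrt(np.clip(vals, 0, None))  # feature-space projections
```

Substituting a nonlinear kernel for the linear one is what turns this into a nonlinear feature extraction technique, without ever forming the feature map.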
