Chaotic Liquid State Machine

Mario Antoine Aoun (Computer Science Department, Faculty of Science, Université du Québec à Montréal, UQAM, Montréal, Canada) and Mounir Boukadoum (Computer Science Department, Faculty of Science, Université du Québec à Montréal, UQAM, Montréal, Canada)
DOI: 10.4018/IJCINI.2015100101
Abstract

The authors implement a Liquid State Machine composed of a pool of chaotic spiking neurons. Furthermore, a synaptic plasticity mechanism operates on the connection weights between the neurons inside the pool. A special feature of the system's classification capability is that it can learn the class of a set of time-varying inputs when trained on positive examples only; thus, it is a one-class classifier. To demonstrate the applicability of this novel neurocomputing architecture, the authors apply it to Online Signature Verification.
Article Preview

1. Introduction

The Reservoir Computing (Maass, Natschläger & Markram, 2002; Jaeger, Maass & Principe, 2007) design methodology consists of building a neural network divided into three parts. The first part is an input layer. The second part is the reservoir, namely a pool of randomly and recurrently connected nodes with fixed connection weights. These nodes can be any type of spiking neuron. The third part is an output layer consisting of a readout mechanism that reads the transient activity of the pool (i.e., the activation of every node in the pool) and performs a classification task accordingly. In the context of Reservoir Computing, an input should be a time-varying signal. Furthermore, the connections from the input layer to the pool, and the connections between the neurons that constitute the pool, have randomly set weights. All these connections keep fixed weights during training (i.e., their weights do not undergo any changes). The connections from the pool to the output layer are also randomly set, but they have flexible weights; that is, they are the only connections of the network that undergo training (i.e., their weights are updated during training). Fundamental models of Reservoir Computing are the Liquid State Machine (LSM) (Maass, Natschläger & Markram, 2002; Maass & Markram, 2004), the Echo State Network (ESN) (Jaeger, Maass & Principe, 2007) and Nonlinear Transient Computation (NTC) (Crook, 2007). Inspired by the work of Maass and Crook, we implement the chaotic Liquid State Machine model presented herein. First, the Chaotic LSM incorporates a synaptic plasticity mechanism inside the liquid layer, an approach suggested by Maass et al. (2002) to enhance the 'Separation Property' of the machine. 
Second, the Chaotic LSM uses a minimal number of chaotic spiking neurons inside the liquid layer, an approach suggested by Crook (2007) that can offer a substitute for the nonlinear dynamics produced by the large number of simple Leaky Integrate-and-Fire (LIF) neurons commonly used in the traditional design of an LSM. Third, the Chaotic LSM implements the theory of Delayed Feedback Control (DFC) (Pyragas, 2003) on the neuron connections, to stabilize the chaotic dynamics of the liquid when the latter is fed with external input. Under such a chaos control scheme, the chaotic LSM operates in the critical region between chaos and order (i.e., it operates at the edge of chaos (Langton, 1990; Natschläger, Bertschinger & Legenstein, 2004)), which can contribute to its generalization capability, especially since it is combined with synaptic plasticity, synaptic scaling and the use of one-class similar training inputs (Maass, Legenstein & Bertschinger, 2005; Legenstein & Maass, 2006; Legenstein & Maass, 2007).
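The three-part design described above (fixed random input and recurrent weights, trained readout only) can be illustrated with a minimal echo-state-style sketch. This is not the authors' chaotic-neuron implementation; it uses plain tanh units, a sine input signal, and a ridge-regression readout, with all dimensions and scaling constants chosen for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not taken from the article)
n_in, n_res, T = 1, 50, 200

# Fixed, randomly set weights: input -> pool and pool -> pool.
# These are never trained, matching the Reservoir Computing recipe.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

# A time-varying input signal, as Reservoir Computing assumes
u = np.sin(np.linspace(0, 8 * np.pi, T)).reshape(T, n_in)

# Drive the pool and record its transient activity (the "liquid state")
x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)
    states[t] = x

# Train ONLY the readout weights (ridge regression) to predict the next sample
target = np.roll(u, -1, axis=0)
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ target)
pred = states @ W_out
```

The readout is linear in the pool states, so training reduces to a single least-squares solve; all of the network's nonlinearity lives in the fixed recurrent pool.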
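The stabilizing effect of Pyragas-style delayed feedback control can be shown on a much simpler chaotic system than the article's spiking neurons: the logistic map. The feedback term K·(x(n−1) − x(n)) vanishes on the target orbit, so it perturbs the dynamics only off-orbit. The parameter values below (r = 3.9, K = −0.6, initial state 0.7) are illustrative choices, not taken from the article:

```python
import numpy as np

r, K = 3.9, -0.6            # r = 3.9: chaotic regime; K: feedback gain
xstar = 1 - 1 / r           # fixed point of x -> r*x*(1-x), unstable since |2-r| > 1

# With feedback u_n = K*(x_{n-1} - x_n), the controlled system is a 2D map
# (x_{n-1}, x_n) -> (x_n, f(x_n) + u_n).  Its Jacobian at the fixed point is:
J = np.array([[0.0, 1.0],
              [K, (2 - r) - K]])        # f'(xstar) = 2 - r
rho = max(abs(np.linalg.eigvals(J)))    # spectral radius < 1 => stabilized

# Simulate the controlled map from a state near the fixed point
x_prev = x = 0.7
for n in range(300):
    u = K * (x_prev - x)                # delayed feedback control term
    x_prev, x = x, r * x * (1 - x) + u  # controlled logistic map step
```

The control signal u tends to zero as the state converges, so DFC is non-invasive: it stabilizes an orbit of the original system rather than creating a new one, which is the property that lets the liquid sit near the edge of chaos.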
