Stability of Discrete Recurrent Neural Networks with Interval Delays: Global Results


Magdi S. Mahmoud, Fouad M. AL Sunni
Copyright © 2012 | Pages: 14
DOI: 10.4018/ijsda.2012040101

Abstract

A global exponential stability method for a class of discrete-time recurrent neural networks with interval time-varying delays and norm-bounded time-varying parameter uncertainties is developed in this paper. The method is based on a new Lyapunov-Krasovskii functional that captures the delay-range-dependent dynamics and compensates for the enlarged time span. In addition, it eliminates the need for overbounding and uses a smaller number of LMI decision variables. Effective solutions to the global stability problem are provided in terms of feasibility testing of parameterized linear matrix inequalities (LMIs). Numerical examples are presented to demonstrate the potential of the developed technique.

1. Introduction

Research on neural networks has experienced a resurgence over the past three decades, initiated largely by several seminal works in the area of recurrent neural networks (RNNs). The hallmark of an RNN, in contrast to feed-forward neural networks, is the existence of connections from posterior layer(s) to anterior layer(s), or connections among neurons within the same layer. Because of these connections, the networks become dynamic systems, which brings many promising capabilities that their feed-forward counterparts do not possess. One obvious capability of RNNs is that they can handle temporal information directly and naturally, whereas feed-forward networks must first convert patterns from the temporal domain into the spatial domain for further processing. Two other distinguishing capabilities of RNNs are associative memory and optimization. The field of RNNs has evolved rapidly in recent years and has become a fusion of a number of research areas in engineering, computer science, mathematics, artificial intelligence, operations research, systems theory, biology, and neuroscience. RNNs have been widely applied to control, optimization, pattern recognition, image processing, and signal processing (Rovithakis & Christodoulou, 2000).

Recent research has pointed out the key role of signal transmission delays, which may cause instability and oscillatory behavior in neural networks (Arik, 2002) or lead to poor performance. Time delays are inevitably encountered in RNNs since the interactions between different neurons are asynchronous. Therefore, the stability analysis of RNNs with time delays has been the subject of numerous studies, and many results have appeared in the literature, including the existence of periodic solutions, global asymptotic stability, and global exponential stability; see Cao and Wang (2003, 2005), Cao and Ho (2005), Chen et al. (2006a, 2006b), Haykin (1994), He et al. (2007), Hu and Wang (2006), Jagannathan and Lewis (1996), Jin et al. (1994), Liang et al. (2005), Liao and Wang (2000), and Liu et al. (2007).

RNNs can be treated in either a continuous-time or a discrete-time setting. It has been pointed out in Mohamed and Gopalsamy (2003) that the discretization process cannot preserve the dynamics of the continuous-time counterpart, which motivates studying discrete-time RNNs in their own right.
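
To make the class of systems concrete, the following minimal Python sketch simulates a discrete-time RNN with an interval time-varying delay. It is an illustration only: the matrices C, A, B, the activation f = tanh, and the delay bounds are assumed here for demonstration and are not the paper's specific model or data.

```python
# Minimal simulation sketch of a discrete-time delayed RNN (illustrative only):
#   x(k+1) = C x(k) + A f(x(k)) + B f(x(k - tau(k))),  tau(k) in [tau_m, tau_M]
# All matrices and bounds below are assumed for demonstration.
import numpy as np

rng = np.random.default_rng(0)

C = np.diag([0.4, 0.5])                   # state feedback (decay) matrix
A = np.array([[0.1, -0.2], [0.2, 0.1]])   # instantaneous connection weights
B = np.array([[0.2, 0.1], [-0.1, 0.2]])   # delayed connection weights
f = np.tanh                               # bounded, 1-Lipschitz activation

tau_m, tau_M = 2, 5                       # interval delay bounds
steps = 100

# History buffer long enough to look back tau_M steps.
x = np.zeros((steps + tau_M + 1, 2))
x[: tau_M + 1] = rng.uniform(-1.0, 1.0, size=(tau_M + 1, 2))  # initial history

for k in range(tau_M, steps + tau_M):
    tau = int(rng.integers(tau_m, tau_M + 1))  # delay varies within the interval
    x[k + 1] = C @ x[k] + A @ f(x[k]) + B @ f(x[k - tau])

print("final state:", x[-1])
```

For this particular choice the norms satisfy ‖C‖ + ‖A‖ + ‖B‖ < 1, so the trajectory contracts to the origin for any admissible delay sequence; the LMI conditions developed in the paper certify stability without resorting to such crude norm bounds.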

Before proceeding, we recall the following fact, which is a version of the S-procedure. Let F(z), y1(z), y2(z), …, yk(z) be some functionals or functions, and define the domain D as

D = {z ∈ Z : y1(z) ≥ 0, y2(z) ≥ 0, …, yk(z) ≥ 0}

together with the two following conditions:

  • (I) F(z) > 0, ∀z ∈ D;

  • (II) there exist ε1 ≥ 0, ε2 ≥ 0, …, εk ≥ 0 such that

    F(z) − ε1y1(z) − ε2y2(z) − … − εkyk(z) > 0, ∀z ∈ Z.

Then (II) implies (I).

This procedure, whenever applicable, is useful in converting non-strict LMIs into strict LMIs. Sometimes, the arguments of a function will be omitted when no confusion can arise.
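
To make the feasibility-testing viewpoint concrete, the following sketch checks a simplified stability LMI for a constant-delay system x(k+1) = A x(k) + Ad x(k − d), derived from the Lyapunov-Krasovskii functional V(k) = xᵀ(k)Px(k) + Σ_{i=k−d}^{k−1} xᵀ(i)Qx(i). This is not the paper's delay-range-dependent functional: the system matrices, the cvxpy modeling, and the solver choice are assumptions made purely for illustration.

```python
# Hedged sketch: feasibility test of a simple Lyapunov-Krasovskii LMI for
#   x(k+1) = A x(k) + Ad x(k - d)   (constant delay d; illustrative only).
# Delta V < 0 along trajectories if the block matrix M below is negative definite.
import numpy as np
import cvxpy as cp

A = np.diag([0.5, 0.6])                      # assumed system matrices
Ad = np.array([[0.1, 0.0], [0.1, 0.1]])

n = A.shape[0]
P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)

# M = [[A'PA - P + Q, A'P Ad], [Ad'P A, Ad'P Ad - Q]] from expanding Delta V.
M = cp.bmat([[A.T @ P @ A - P + Q, A.T @ P @ Ad],
             [Ad.T @ P @ A,        Ad.T @ P @ Ad - Q]])
M = 0.5 * (M + M.T)                          # enforce symmetry of the expression

eps = 1e-6                                   # margin to make the LMIs strict
prob = cp.Problem(cp.Minimize(0),
                  [P >> eps * np.eye(n),
                   Q >> eps * np.eye(n),
                   M << -eps * np.eye(2 * n)])
prob.solve(solver=cp.SCS)
print("stability LMI feasible:", prob.status == cp.OPTIMAL)
```

Note how the small margin eps turns the non-strict semidefinite constraints into strict ones, in the spirit of the conversion discussed above; feasibility of the LMI certifies asymptotic stability for this simplified constant-delay case.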
