Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/4944
Title: Discrete-time recurrent neural networks with time-varying delays: Exponential stability analysis
Authors: Liu, Y
Wang, Z
Serrano, A
Liu, X
Keywords: Discrete recurrent neural networks;Exponential stability;Time-varying delays;Lyapunov–Krasovskii functional;Linear matrix inequality
Issue Date: 2007
Publisher: Elsevier
Citation: Physics Letters A, 362(5-6): 480-488, Mar 2007
Abstract: This Letter is concerned with the exponential stability analysis problem for a class of discrete-time recurrent neural networks (DRNNs) with time delays. The delay is time-varying, and the activation functions are assumed to be neither differentiable nor strictly monotonic. Furthermore, the description of the activation functions is more general than the commonly used Lipschitz conditions. Under such mild conditions, we first prove the existence of the equilibrium point. Then, by employing a Lyapunov–Krasovskii functional, a unified linear matrix inequality (LMI) approach is developed to establish sufficient conditions for the DRNNs to be globally exponentially stable. It is shown that the delayed DRNNs are globally exponentially stable if a certain LMI is solvable, where the feasibility of such an LMI can be easily checked with the numerically efficient Matlab LMI Toolbox. A simulation example is presented to show the usefulness of the derived LMI-based stability condition.
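Note: the feasibility check described in the abstract can be reproduced with any semidefinite-programming front end, not only the Matlab LMI Toolbox. The sketch below uses Python with CVXPY to test the standard delay-independent Lyapunov–Krasovskii LMI for a linear delayed discrete-time system x(k+1) = A x(k) + Ad x(k − τ); the matrices A and Ad are hypothetical placeholders, the delay is taken constant for simplicity, and this is not the exact LMI derived in the Letter (which additionally encodes the activation-function bounds and the time-varying delay range).

    # Minimal LMI feasibility sketch, assuming CVXPY in place of the
    # Matlab LMI Toolbox. Condition checked: P > 0, Q > 0, and
    #   [ A'PA + Q - P   A'P Ad      ]
    #   [ Ad'PA          Ad'P Ad - Q ]  < 0,
    # the standard delay-independent stability LMI for a constant delay.
    import numpy as np
    import cvxpy as cp

    n = 2
    A = np.array([[0.4, 0.0], [0.1, 0.3]])    # hypothetical state matrix
    Ad = np.array([[0.1, 0.05], [0.0, 0.1]])  # hypothetical delayed-term matrix

    P = cp.Variable((n, n), symmetric=True)
    Q = cp.Variable((n, n), symmetric=True)
    eps = 1e-6  # small margin to enforce strict definiteness numerically

    # Block matrix arising from the difference of the Lyapunov-Krasovskii
    # functional V(k) = x(k)'P x(k) + sum_{i=k-tau}^{k-1} x(i)'Q x(i).
    M = cp.bmat([
        [A.T @ P @ A + Q - P, A.T @ P @ Ad],
        [Ad.T @ P @ A, Ad.T @ P @ Ad - Q],
    ])

    constraints = [
        P >> eps * np.eye(n),
        Q >> eps * np.eye(n),
        M << -eps * np.eye(2 * n),
    ]
    problem = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility
    problem.solve(solver=cp.SCS)                       # SCS handles SDPs
    print("LMI feasible:", problem.status == cp.OPTIMAL)

If the solver reports feasibility, the returned P and Q certify exponential stability of this simplified linear delayed system, illustrating the "stability check reduces to LMI feasibility" reasoning used in the Letter.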
Description: This is the post-print version of the article. The official published version can be obtained from the link below. Copyright 2007 Elsevier Ltd.
URI: http://bura.brunel.ac.uk/handle/2438/4944
DOI: http://dx.doi.org/10.1016/j.physleta.2006.10.073
ISSN: 0375-9601
Appears in Collections: Computer Science; Dept of Computer Science Research Papers

Files in This Item:
File: Fulltext.pdf
Size: 167.67 kB
Format: Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.