Global Robust Exponential Dissipativity for Interval Recurrent Neural Networks with Infinity Distributed Delays

2013 ◽  
Vol 2013 ◽  
pp. 1-16 ◽  
Author(s):  
Xiaohong Wang ◽  
Huan Qi

This paper is concerned with the robust dissipativity problem for interval recurrent neural networks (IRNNs) with general activation functions, continuous time-varying delays, and infinitely distributed delays. By employing a new differential inequality, constructing two different kinds of Lyapunov functions, and dropping the requirement that the activation functions be bounded, monotonic, and differentiable, several sufficient conditions are established that guarantee global robust exponential dissipativity for the addressed IRNNs in terms of linear matrix inequalities (LMIs), which can be checked easily with the LMI Control Toolbox in MATLAB. Furthermore, specific estimates of the positive invariant and globally exponentially attractive sets of the addressed system are also derived. Compared with the previous literature, the results obtained in this paper improve and extend earlier global dissipativity conclusions. Finally, two numerical examples are provided to demonstrate the effectiveness of the proposed results.
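LMI-based conditions of this kind ultimately reduce to finding a positive definite Lyapunov matrix. As a minimal sketch of that idea — using SciPy rather than the MATLAB LMI Control Toolbox, and a toy delay-free Hurwitz matrix A that is purely illustrative, not the paper's IRNN model — one can solve the Lyapunov equation AP + PAᵀ = −Q and check that P is positive definite:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Toy Hurwitz state matrix (illustrative only, not the paper's IRNN model)
A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])
Q = np.eye(2)  # any positive definite choice

# Solve A @ P + P @ A.T = -Q; for Hurwitz A the solution P is positive
# definite, which certifies global exponential stability of x' = A x
P = solve_continuous_lyapunov(A, -Q)
min_eig = np.linalg.eigvalsh(P).min()
print(min_eig > 0.0)  # True: P is positive definite
```

The LMI toolboxes cited in these abstracts search for such a P (subject to additional delay-dependent constraints) instead of solving the equality directly.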

2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Lei Ding ◽  
Hong-Bing Zeng ◽  
Wei Wang ◽  
Fei Yu

This paper investigates the stability of static recurrent neural networks (SRNNs) with a time-varying delay. Based on the complete delay-decomposing approach and quadratic separation framework, a novel Lyapunov-Krasovskii functional is constructed. By employing a reciprocally convex technique to consider the relationship between the time-varying delay and its varying interval, some improved delay-dependent stability conditions are presented in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to show the merits and the effectiveness of the proposed methods.


2015 ◽  
Vol 742 ◽  
pp. 399-403
Author(s):  
Ya Jun Li ◽  
Jing Zhao Li

This paper investigates the exponential stability problem for a class of stochastic neural networks with leakage delay. By employing a suitable Lyapunov functional and stochastic stability theory techniques, sufficient conditions under which the stochastic neural network is exponentially stable in the mean square are proposed and proved. All results are expressed in terms of linear matrix inequalities (LMIs). An example with simulation is presented to show the effectiveness of the proposed method.
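Exponential mean-square stability can be illustrated on a scalar toy SDE dx = −a x dt + σ x dW (not the paper's network model): when 2a > σ², E[x(t)²] decays like exp((σ² − 2a)t). A minimal Euler–Maruyama Monte Carlo sketch, with purely illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
a, sigma = 1.0, 0.5        # 2a > sigma**2, so the origin is mean-square stable
dt, steps, paths = 0.01, 200, 2000
x = np.ones(paths)         # all sample paths start at x(0) = 1

# Euler-Maruyama simulation of dx = -a x dt + sigma x dW
for _ in range(steps):
    dW = np.sqrt(dt) * rng.standard_normal(paths)
    x = x + (-a * x) * dt + sigma * x * dW

# Theory predicts E[x(2)^2] = exp((sigma**2 - 2a) * 2) = exp(-3.5) ~ 0.03
mean_square = np.mean(x**2)
print(mean_square)
```

The sample mean square at t = 2 sits near the theoretical exp(−3.5), far below the initial value of 1.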


2012 ◽  
Vol 2012 ◽  
pp. 1-21 ◽  
Author(s):  
W. Weera ◽  
P. Niamsup

The problem of exponential stabilization of neutral-type neural networks with various activation functions and interval nondifferentiable and distributed time-varying delays is considered. The interval time-varying delay function is not required to be differentiable. By employing a new and improved Lyapunov-Krasovskii functional combined with the Leibniz-Newton formula, the stabilizability criteria are formulated in terms of linear matrix inequalities. Numerical examples are given to illustrate the effectiveness of the obtained results.


2015 ◽  
Vol 137 (4) ◽  
Author(s):  
Pin-Lin Liu

In this paper, the problems of determining the robust exponential stability and estimating the exponential convergence rate for recurrent neural networks (RNNs) with parametric uncertainties and time-varying delay are studied. The relationship among the time-varying delay, its upper bound, and their difference is taken into account. The developed stability conditions are in terms of linear matrix inequalities (LMIs) and the integral inequality approach (IIA), which can be checked easily by recently developed algorithms solving LMIs. Furthermore, the proposed stability conditions are less conservative than some recently known ones in the literature, and this has been demonstrated via four examples with simulation.


2007 ◽  
Vol 17 (08) ◽  
pp. 2723-2738 ◽  
Author(s):  
XIA HUANG ◽  
JAMES LAM ◽  
JINDE CAO ◽  
SHENGYUAN XU

In this paper, the robust synchronization problem is addressed for recurrent neural networks with time-varying delay via linear feedback control. Robustness here refers to the allowance of parameter mismatch between the drive system and the response system. Sufficient conditions for robust synchronization with a synchronization error bound, expressed as linear matrix inequalities (LMIs), are derived based on Lyapunov–Krasovskii functionals. Both delay-dependent and delay-independent conditions are obtained. Two examples are given to illustrate the results.
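The drive–response setup with linear feedback can be sketched on a toy linear system (illustrative only; the paper's model is a delayed RNN with parameter mismatch). The response receives the control u = −k(y − x), so the synchronization error e = y − x obeys e' = (A − kI)e and decays when k is large enough:

```python
import numpy as np

# Toy drive-response pair (illustrative, not the paper's RNN):
#   drive:    x' = A x
#   response: y' = A y + u,   u = -k (y - x)   (linear feedback)
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])   # marginally stable oscillator
k = 2.0                       # feedback gain; A - k*I is Hurwitz
dt, steps = 0.01, 2000

x = np.array([1.0, 0.0])      # drive initial state
y = np.array([-1.0, 0.5])     # mismatched response initial state
for _ in range(steps):
    u = -k * (y - x)
    x = x + dt * (A @ x)
    y = y + dt * (A @ y + u)

err = np.linalg.norm(y - x)   # synchronization error after t = 20
print(err)
```

Since the error map is linear, e shrinks geometrically regardless of the oscillating drive trajectory; the LMI conditions in the abstract generalize this to delayed dynamics and bounded parameter mismatch.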


Author(s):  
Umesh Kumar ◽  
Subir Das ◽  
Chuangxia Huang ◽  
Jinde Cao

In this article, sufficient conditions for fixed-time synchronization of time-delayed quaternion-valued neural networks (QVNNs) are derived. Firstly, the QVNNs are decomposed into four real-valued systems. Then, using available lemmas and by constructing a Lyapunov function, a synchronization criterion for the neural networks is proposed; the activation functions are assumed to satisfy the Lipschitz condition. A suitable controller is designed to synchronize the master–slave systems. The effectiveness of the proposed result is validated by comparing the settling times obtained by applying two different existing lemmas to a particular problem: the synchronization of two identical QVNNs with time-varying delay with the help of suitable controllers.
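The decomposition of a quaternion-valued system into four real-valued ones rests on writing quaternion multiplication as a real 4×4 matrix acting on a real 4-vector. A minimal sketch (the quaternion values are illustrative, not taken from the paper):

```python
import numpy as np

def hamilton_matrix(p):
    """Real 4x4 matrix of left multiplication by p = p0 + p1*i + p2*j + p3*k."""
    p0, p1, p2, p3 = p
    return np.array([
        [p0, -p1, -p2, -p3],
        [p1,  p0, -p3,  p2],
        [p2,  p3,  p0, -p1],
        [p3, -p2,  p1,  p0],
    ])

p = np.array([1.0, 2.0, 3.0, 4.0])   # illustrative quaternions
q = np.array([0.5, -1.0, 2.0, 0.0])
pq = hamilton_matrix(p) @ q          # Hamilton product p * q as a real 4-vector

# Sanity check: quaternion norms are multiplicative, |p q| = |p| |q|
print(np.isclose(np.linalg.norm(pq), np.linalg.norm(p) * np.linalg.norm(q)))
```

Applying this rewriting to every quaternion weight turns one n-dimensional QVNN into a 4n-dimensional real system, to which real-valued Lyapunov arguments apply.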


Author(s):  
Mengying Ding ◽  
Yali Dong

This paper is concerned with the problem of robust finite-time boundedness for discrete-time neural networks with time-varying delays. By constructing an appropriate Lyapunov-Krasovskii functional, we propose sufficient conditions, in terms of linear matrix inequalities, that ensure the robust finite-time boundedness of discrete-time neural networks with time-varying delay. Sufficient conditions for the robust finite-time stability of such networks are then given. Finally, a numerical example is presented to illustrate the efficiency of the proposed methods.


2009 ◽  
Vol 2009 ◽  
pp. 1-14 ◽  
Author(s):  
Chien-Yu Lu ◽  
Chin-Wen Liao ◽  
Hsun-Heng Tsai

This paper examines a passivity analysis for a class of discrete-time recurrent neural networks (DRNNs) with norm-bounded time-varying parameter uncertainties and interval time-varying delay. The activation functions are assumed to be globally Lipschitz continuous. Based on an appropriate type of Lyapunov functional, sufficient passivity conditions for the DRNNs are derived in terms of a family of linear matrix inequalities (LMIs). Two numerical examples are given to illustrate the effectiveness and applicability of the results.


2013 ◽  
Vol 2013 ◽  
pp. 1-7
Author(s):  
Wenguang Luo ◽  
Xiuling Wang ◽  
Yonghua Liu ◽  
Hongli Lan

The problem of global exponential stability for recurrent neural networks with time-varying delay is investigated. By dividing the time-delay interval [0, τ(t)] into K+1 dynamical subintervals, a new Lyapunov-Krasovskii functional is introduced; then, a novel linear-matrix-inequality (LMI-)based delay-dependent exponential stability criterion is derived, which is less conservative than some in the previous literature (Zhang et al., 2005; He et al., 2006; Wu et al., 2008). An illustrative example is finally provided to show the effectiveness and the advantage of the proposed result.

