Pinning Synchronization of Delayed Neural Networks with Nonlinear Inner-Coupling

2011 ◽  
Vol 2011 ◽  
pp. 1-12 ◽  
Author(s):  
Yangling Wang ◽  
Jinde Cao

Without assuming the symmetry or irreducibility of the outer-coupling weight configuration matrices, we investigate the pinning synchronization of delayed neural networks with nonlinear inner-coupling. Several delay-dependent controlled synchronization criteria are obtained in terms of linear matrix inequalities (LMIs). An example is presented to illustrate the application of the obtained criteria.
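As a rough illustration of the pinning idea, the toy simulation below couples a few nodes through an asymmetric, zero-row-sum outer-coupling matrix with a nonlinear inner-coupling h = tanh, and applies feedback control only to two "pinned" nodes. All dynamics, gains, and sizes are invented stand-ins, and the delays from the paper's model are omitted for brevity; this is not the paper's model or its LMI criteria.

```python
import numpy as np

# Toy sketch of pinning control on a diffusively coupled network with a
# nonlinear inner-coupling h = tanh. All dynamics and gains are illustrative
# stand-ins, not the model or criteria from the paper (in particular, the
# time delays are omitted for brevity).
rng = np.random.default_rng(0)
N, c, k, dt, steps = 6, 1.0, 5.0, 0.01, 2000

# Asymmetric outer-coupling matrix with zero row sums; it need not be
# symmetric or irreducible, matching the paper's relaxed assumptions.
A = rng.uniform(0.0, 1.0, size=(N, N))
np.fill_diagonal(A, 0.0)
np.fill_diagonal(A, -A.sum(axis=1))

x = rng.uniform(-1.0, 1.0, size=N)   # node states
s = 0.5                              # isolated target trajectory, s' = -s
pinned = np.zeros(N, dtype=bool)
pinned[:2] = True                    # pin only the first two nodes

for _ in range(steps):
    u = -k * (x - s) * pinned                      # pinning feedback
    x = x + dt * (-x + c * (A @ np.tanh(x)) + u)   # forward-Euler step
    s = s + dt * (-s)

sync_error = np.max(np.abs(x - s))   # all nodes track the target trajectory
```

With the zero-row-sum coupling, the coupling term vanishes on the synchronization manifold, so the pinned feedback plus diffusive coupling drives every node toward the target trajectory.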

2010 ◽  
Vol 2010 ◽  
pp. 1-19 ◽  
Author(s):  
Qiankun Song ◽  
Jinde Cao

The problems of global dissipativity and global exponential dissipativity are investigated for uncertain discrete-time neural networks with time-varying delays and general activation functions. By constructing appropriate Lyapunov-Krasovskii functionals and employing the linear matrix inequality technique, several new delay-dependent criteria for checking the global dissipativity and global exponential dissipativity of the addressed neural networks are established in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Illustrative examples are given to show the effectiveness of the proposed criteria. It is noteworthy that because neither model transformation nor free-weighting matrices are employed to deal with cross terms in the derivation of the dissipativity criteria, the obtained results are less conservative and more computationally efficient.
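The criteria above are LMI feasibility tests. As a minimal numpy stand-in for the MATLAB LMI toolbox, the sketch below solves the classical continuous Lyapunov equation A^T P + P A = -Q by Kronecker vectorization and then verifies the resulting LMI numerically via eigenvalue checks. The matrix A is an arbitrary stable example, not a model from the paper, and this simple LMI is only a proxy for the paper's delay-dependent conditions.

```python
import numpy as np

# Solve A.T @ P + P @ A = -Q for P, then check the LMI numerically.
# A is an arbitrary stable matrix chosen for illustration.
A = np.array([[-2.0, 1.0],
              [0.5, -3.0]])
Q = np.eye(2)
n = A.shape[0]

# vec(A.T P + P A) = (I (x) A.T + A.T (x) I) vec(P)
K = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
P = np.linalg.solve(K, -Q.flatten()).reshape(n, n)

lmi = A.T @ P + P @ A
P_pos = bool(np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0))       # P > 0
lmi_neg = bool(np.all(np.linalg.eigvalsh((lmi + lmi.T) / 2) < 0))  # LMI < 0
```

In practice one would hand such conditions to a dedicated solver (the MATLAB LMI toolbox mentioned in the abstract, or a semidefinite programming package); the eigenvalue check here only verifies a candidate solution.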


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Xiaofeng Chen ◽  
Qiankun Song ◽  
Xiaohui Liu ◽  
Zhenjiang Zhao

Complex-valued neural networks with unbounded time-varying delays are considered. By constructing appropriate Lyapunov-Krasovskii functionals and employing the free-weighting matrix method, several delay-dependent criteria for checking the global μ-stability of the addressed complex-valued neural networks are established in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Two examples with simulations are given to show the effectiveness and reduced conservatism of the proposed criteria.
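To make the complex-valued setting concrete, the sketch below iterates a discrete-time complex-valued recurrent map with a split-type activation (tanh applied separately to real and imaginary parts, a common choice for complex-valued networks). The weights and the small-gain contraction argument are invented for illustration; the paper's continuous-time model, unbounded delays, and μ-stability criteria are not reproduced here.

```python
import numpy as np

# Discrete-time complex-valued recurrent iteration with a split activation.
# Small weights make the map a contraction, so it settles to a fixed point.
rng = np.random.default_rng(1)

def act(z):
    # split-type activation: tanh on real and imaginary parts separately
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

n = 4
W = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) * 0.1
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

z = np.zeros(n, dtype=complex)
for _ in range(200):
    z = act(W @ z + b)

residual = np.linalg.norm(act(W @ z + b) - z)   # fixed-point residual
```

The split activation is 1-Lipschitz in the complex norm, so whenever the spectral norm of W is below 1 the iteration converges to a unique equilibrium, which is the discrete-time analogue of the stability property the LMI criteria certify.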


2007 ◽  
Vol 03 (03) ◽  
pp. 321-330 ◽  
Author(s):  
XU-YANG LOU ◽  
BAO-TONG CUI

This paper considers passivity conditions for stochastic neural networks with time-varying delays and random abrupt changes. Sufficient conditions for passivity are developed in the linear matrix inequality (LMI) setting. The results obtained in this paper improve and extend some previous results.


2010 ◽  
Vol 2010 ◽  
pp. 1-14 ◽  
Author(s):  
Choon Ki Ahn

A new robust training law, called an input/output-to-state stable training law (IOSSTL), is proposed for dynamic neural networks with external disturbance. Based on a linear matrix inequality (LMI) formulation, the IOSSTL not only guarantees exponential stability but also reduces the effect of an external disturbance. It is shown that the IOSSTL can be obtained by solving the LMI, which is easily handled by standard numerical packages. Numerical examples are presented to demonstrate the validity of the proposed IOSSTL.


2012 ◽  
Vol 2012 ◽  
pp. 1-8 ◽  
Author(s):  
Yangfan Wang ◽  
Linshan Wang

This paper studies the global exponential robust stability of high-order Hopfield neural networks with time-varying delays. By employing a new Lyapunov-Krasovskii functional and linear matrix inequalities, some criteria of global exponential robust stability for the high-order neural networks are established, which are easily verifiable and have wider applicability.


2015 ◽  
Vol 2015 ◽  
pp. 1-18 ◽  
Author(s):  
M. J. Park ◽  
O. M. Kwon ◽  
Ju H. Park ◽  
S. M. Lee ◽  
E. J. Cha

This paper considers the problem of delay-dependent state estimation for neural networks with time-varying delays and stochastic parameter uncertainties. The parameter uncertainties are assumed to be affected by a randomly changing environment, and their stochastic information, such as mean and variance, is utilized in the proposed method. By constructing a newly augmented Lyapunov-Krasovskii functional, an estimator design method is introduced within the framework of linear matrix inequalities (LMIs) for a neural network model with stochastic parameter uncertainties that has not been considered before. Two numerical examples are given to show the improvements over existing results and the effectiveness of the proposed idea.
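The estimator structure can be illustrated with a hypothetical Luenberger-type observer for a discrete-time neural network x_{k+1} = A x_k + W tanh(x_k) + b with output y_k = C x_k. The gains below are hand-picked toy values (no LMI-based synthesis, no delays, and no stochastic parameter uncertainty), purely to show the correction mechanism that the paper's LMI framework is designed to tune.

```python
import numpy as np

# Toy Luenberger-type state estimator: the estimate xh is corrected by the
# output error L @ (y - C @ xh). Gains are illustrative, not LMI-designed.
rng = np.random.default_rng(2)
n = 3
A = 0.5 * np.eye(n)
W = 0.05 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
C = np.array([[1.0, 0.0, 0.0]])       # only the first state is measured
L = np.array([[0.3], [0.0], [0.0]])   # observer gain (toy choice)

x = rng.standard_normal(n)            # true state
xh = np.zeros(n)                      # estimate starts with no information
err0 = np.linalg.norm(x - xh)

for _ in range(200):
    y = C @ x                                            # measurement y_k
    x = A @ x + W @ np.tanh(x) + b                       # plant update
    xh = A @ xh + W @ np.tanh(xh) + b + L @ (y - C @ xh)  # estimator update

err = np.linalg.norm(x - xh)          # estimation error after 200 steps
```

Because the error dynamics are governed by A - LC plus a Lipschitz nonlinear term, the toy gains make the error a contraction; the paper's contribution is choosing such gains systematically via LMIs under delays and random parameter variations.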


2014 ◽  
Vol 69 (1-2) ◽  
pp. 70-80 ◽  
Author(s):  
Mathiyalagan Kalidass ◽  
Hongye Su ◽  
Sakthivel Rathinasamy

This paper presents a robust analysis approach to the stochastic stability of uncertain Markovian jumping discrete-time neural networks (MJDNNs) with time delay in the leakage term. By choosing an appropriate Lyapunov functional and using the free weighting matrix technique, a set of delay-dependent stability criteria is derived. The stability results are delay dependent, depending not only on the upper bounds of the time delays but also on their lower bounds. The obtained stability criteria are established in terms of linear matrix inequalities (LMIs), which can be effectively solved by standard numerical packages. Finally, some illustrative numerical examples with simulation results are provided to demonstrate the applicability of the obtained results. It is shown that even if there is no leakage delay, the obtained results are less restrictive than those in some recent works.
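The Markovian jumping mechanism can be sketched as follows: the network's matrices switch between modes according to a Markov chain with a fixed transition probability matrix. The mode matrices, transition probabilities, and stability margin below are invented for illustration; the leakage delay and the paper's LMI criteria are not modeled.

```python
import numpy as np

# Toy Markovian jumping discrete-time neural network: system matrices switch
# between two modes governed by a Markov chain. Values are illustrative only.
rng = np.random.default_rng(3)

A = {0: 0.4 * np.eye(2), 1: 0.6 * np.eye(2)}    # mode-dependent state matrices
W = {0: np.array([[0.10, -0.05], [0.05, 0.10]]),  # mode-dependent weights
     1: np.array([[-0.08, 0.02], [0.03, 0.07]])}
Pi = np.array([[0.9, 0.1],                        # mode transition probabilities
               [0.2, 0.8]])

mode = 0
x = np.array([2.0, -1.5])
for _ in range(100):
    x = A[mode] @ x + W[mode] @ np.tanh(x)        # jump-dependent update
    mode = rng.choice(2, p=Pi[mode])              # sample the next mode

final_norm = np.linalg.norm(x)                    # state decays in every mode
```

Here both modes are individually contractive, so the trajectory decays along every switching path; the interesting case treated by the LMI criteria is when stability must hold in the mean-square sense even though individual modes need not be stable.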

