Exponential Synchronization for Stochastic Neural Networks with Mixed Time Delays and Markovian Jump Parameters via Sampled Data

2014, Vol 2014, pp. 1-17
Author(s): Yingwei Li, Xueqing Guo

The exponential synchronization problem for stochastic neural networks (SNNs) with mixed time delays and Markovian jump parameters under a sampled-data controller is investigated. Based on a novel Lyapunov-Krasovskii functional, stochastic analysis theory, and the linear matrix inequality (LMI) approach, we derive novel sufficient conditions that guarantee that the master system synchronizes exponentially with the slave system. A design method for the desired sampled-data controller is also proposed. To capture the dynamical behavior of the system more fully, both Markovian jump parameters and stochastic disturbances are considered, where the stochastic disturbances are given in the form of a Brownian motion. The results obtained in this paper are less conservative than some previous results in the literature. Finally, two numerical examples are given to illustrate the effectiveness of the proposed methods.
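For orientation, a master-slave SNN configuration with a mode-dependent sampled-data controller of the kind described above is often written in a generic form such as the following (the mode-dependent matrices A(r_t), B(r_t), C(r_t), D(r_t), the gain K(r_t), and the delay notation are assumed here for illustration and may differ from the paper's exact model):

```latex
\begin{aligned}
\text{master:}\quad dx(t) &= \Big[-A(r_t)\,x(t) + B(r_t)\,f(x(t)) + C(r_t)\,f\big(x(t-\tau(t))\big)
      + D(r_t)\!\int_{t-d}^{t}\! f(x(s))\,ds\Big]\,dt,\\
\text{slave:}\quad dy(t) &= \Big[-A(r_t)\,y(t) + B(r_t)\,f(y(t)) + C(r_t)\,f\big(y(t-\tau(t))\big)
      + D(r_t)\!\int_{t-d}^{t}\! f(y(s))\,ds + u(t)\Big]\,dt
      + \sigma\big(t, e(t), e(t-\tau(t)), r_t\big)\,d\omega(t),\\
u(t) &= K(r_t)\,e(t_k), \qquad t_k \le t < t_{k+1}, \qquad e(t) = y(t) - x(t),
\end{aligned}
```

where r_t is the finite-state Markov chain, ω(t) is a Brownian motion, and the t_k are the sampling instants; LMI-based conditions of this type typically bound the admissible sampling interval t_{k+1} - t_k.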

2010, Vol 88 (12), pp. 885-898
Author(s): R. Raja, R. Sakthivel, S. Marshal Anthoni

This paper investigates the stability of a class of discrete-time stochastic neural networks with mixed time delays and impulsive effects. By constructing a new Lyapunov–Krasovskii functional and combining it with the linear matrix inequality (LMI) approach, a novel set of sufficient conditions is derived to ensure the global asymptotic stability of the equilibrium point of the addressed discrete-time neural networks. The result is then extended to the robust stability of uncertain discrete-time stochastic neural networks with impulsive effects. One important feature of this paper is that the stability of the equilibrium point is proved under mild conditions on the activation functions, which are not required to be differentiable or strictly monotonic. In addition, two numerical examples are provided to show that the proposed method is effective and less conservative.
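To make the setting concrete, the following is a toy simulation of a discrete-time stochastic neural network with a constant delay and state-proportional noise; the matrices, delay, and noise level are invented for illustration and are not the paper's model or its LMI conditions. With a contractive state matrix and bounded activations, the state norm decays toward zero:

```python
import numpy as np

# Toy sketch (not the paper's model): a 2-neuron discrete-time
# stochastic neural network with a constant delay and small
# state-proportional noise, simulated to observe decay of the state.
rng = np.random.default_rng(0)

A = np.diag([0.4, 0.3])          # contractive state matrix
B = np.array([[0.10, -0.05],
              [0.08,  0.10]])    # connection weights (non-delayed)
C = np.array([[0.05,  0.02],
              [-0.03, 0.04]])    # delayed connection weights
tau = 3                          # discrete delay
f = np.tanh                      # bounded, 1-Lipschitz activation

T = 200
x = np.zeros((T + tau, 2))
x[:tau + 1] = np.array([1.0, -1.5])   # constant initial history

for k in range(tau, T + tau - 1):
    # noise proportional to the current state, so it vanishes at 0
    noise = 0.01 * rng.standard_normal(2) * np.linalg.norm(x[k])
    x[k + 1] = A @ x[k] + B @ f(x[k]) + C @ f(x[k - tau]) + noise

print(np.linalg.norm(x[tau]), np.linalg.norm(x[-1]))
```

Because the Lipschitz gains of the non-delayed and delayed terms are small relative to the contraction of A, the trajectory converges regardless of the delayed history, which is the qualitative behavior the LMI conditions certify analytically.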


2020, Vol 25 (5)
Author(s): Iswarya Manickam, Raja Ramachandran, Grienggrai Rajchakit, Jinde Cao, Chuangxia Huang

This paper concerns exponential stability in the Lagrange sense for a class of stochastic Cohen–Grossberg neural networks (SCGNNs) with Markovian jump and mixed time-delay effects. A systematic approach to constructing a global Lyapunov function for SCGNNs with mixed time delays and Markovian jumping is provided by combining the Lyapunov method with graph-theoretic results. Moreover, by using some inequality techniques in Lyapunov-type and coefficient-type theorems, we obtain two kinds of sufficient conditions that ensure global exponential stability (GES) in the Lagrange sense for the addressed SCGNNs. Finally, some examples with numerical simulations are given to demonstrate the effectiveness of the obtained results.
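For context, global exponential stability in the Lagrange sense is commonly formalized as follows (the constants M, γ, L are assumed here for illustration): there exist M > 0, γ > 0, and L ≥ 0 such that for every initial datum ξ,

```latex
\mathbb{E}\,\|x(t;\xi)\|^{2} \;\le\; M\,\|\xi\|_{\tau}^{2}\,e^{-\gamma t} + L, \qquad t \ge 0,
```

so every trajectory converges exponentially in mean square to a globally attractive bounded set; unlike stability in the Lyapunov sense, convergence to a single equilibrium point is not required.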


2014, Vol 2014, pp. 1-13
Author(s): Yingwei Li, Huaiqin Wu

The exponential stability of a class of stochastic neural networks (SNNs) with Markovian jump parameters, mixed time delays, and α-inverse Hölder activation functions is investigated. The jumping parameters are modeled as a continuous-time finite-state Markov chain. Firstly, based on properties of the Brouwer degree, the existence and uniqueness of the equilibrium point for SNNs without noise perturbations are proved. Secondly, by applying the Lyapunov-Krasovskii functional approach, stochastic analysis theory, and the linear matrix inequality (LMI) technique, new delay-dependent sufficient criteria are obtained in terms of LMIs which ensure that the SNNs with noise perturbations are globally exponentially stable in the mean square. Finally, two simulation examples are provided to demonstrate the validity of the theoretical results.
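The Markovian jump mechanism mentioned above can be illustrated with a toy simulation: a scalar SDE whose decay rate switches according to a two-state continuous-time Markov chain, integrated by the Euler-Maruyama scheme. The generator Q, the mode-dependent rates, and the noise intensity are invented for illustration and are not the paper's system:

```python
import numpy as np

# Toy sketch (not the paper's system): a scalar SDE whose drift rate
# switches with a 2-state continuous-time Markov chain, integrated
# with Euler-Maruyama.  Both modes are stable here, so the state
# decays toward zero in mean square.
rng = np.random.default_rng(1)

Q = np.array([[-2.0,  2.0],      # generator of the Markov chain
              [ 3.0, -3.0]])
a = np.array([1.5, 0.8])         # mode-dependent decay rates
sigma = 0.1                      # state-proportional noise intensity

dt, T = 1e-3, 10.0
n = int(T / dt)
x, mode = 2.0, 0
for _ in range(n):
    # jump out of the current mode with probability ~ -Q[mode,mode]*dt
    if rng.random() < -Q[mode, mode] * dt:
        mode = 1 - mode          # with 2 states, the target is fixed
    dw = rng.standard_normal() * np.sqrt(dt)
    x += -a[mode] * x * dt + sigma * x * dw

print(abs(x))
```

When some modes are unstable, stability instead hinges on how long the chain dwells in each mode, which is exactly what the mode-dependent LMI criteria account for.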

