Exponential stability of some linear continuous time difference systems

2012 ◽ Vol 61 (1) ◽ pp. 62-68 ◽ Author(s): D. Melchor-Aguilar
2014 ◽ Vol 2014 ◽ pp. 1-10 ◽ Author(s): Jinxiang Cai, Zhenkun Huang, Honghua Bin

We present a stability analysis of delayed Wilson-Cowan networks on time scales. By applying the theory of calculus on time scales, the contraction mapping principle, and a Lyapunov functional, new sufficient conditions are obtained that ensure the existence and exponential stability of a periodic solution of the considered system. The obtained results are general and can be applied to both discrete-time and continuous-time Wilson-Cowan networks.
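For context, a delayed two-population Wilson-Cowan system on a time scale $\mathbb{T}$ is commonly written with the delta derivative; the form and symbols below are illustrative, not copied from the paper:

$$
\begin{aligned}
x^{\Delta}(t) &= -a_1\,x(t) + \bigl(k_1 - x(t)\bigr)\, f\bigl(w_{11}\,x(t-\tau) - w_{12}\,y(t-\tau) + I_1(t)\bigr),\\
y^{\Delta}(t) &= -a_2\,y(t) + \bigl(k_2 - y(t)\bigr)\, f\bigl(w_{21}\,x(t-\tau) - w_{22}\,y(t-\tau) + I_2(t)\bigr),
\end{aligned}
$$

where $x$ and $y$ are the excitatory and inhibitory activity levels, $f$ is a sigmoidal response function, and $\tau$ is the transmission delay. The unification mentioned in the abstract comes from the delta derivative: on $\mathbb{T} = \mathbb{R}$ it reduces to the ordinary derivative (continuous-time network), while on $\mathbb{T} = \mathbb{Z}$ it is the forward difference $x(t+1) - x(t)$ (discrete-time analogue).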


2015 ◽ Vol 2015 ◽ pp. 1-15 ◽ Author(s): Juan Chen, Zhenkun Huang, Jinxiang Cai

We investigate a class of fuzzy neural networks with Hebbian-type unsupervised learning on time scales. By using the Lyapunov functional method, some new sufficient conditions are derived to ensure the learning dynamics and exponential stability of fuzzy networks on time scales. Our results are general and include continuous-time learning-based fuzzy networks and their corresponding discrete-time analogues. Moreover, our results reveal some new learning behaviors of fuzzy synapses on time scales that are seldom discussed in the literature.
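To illustrate the kind of learning dynamics the abstract refers to, here is a minimal sketch of a Hebbian-type unsupervised rule with synaptic decay. The specific rule, parameter names, and values are assumptions for illustration; the paper's actual model couples learning with fuzzy AND/OR network dynamics on general time scales, which is not reproduced here.

```python
# Assumed generic Hebbian-type rule with decay (illustrative only):
#   dw/dt = -a*w(t) + eta * pre * post
# discretized with a forward-Euler step of size h.

def hebbian_step(w, pre, post, a=1.0, eta=0.5, h=0.01):
    """One Euler step of the decaying Hebbian rule."""
    return w + h * (-a * w + eta * pre * post)

def learn(pre=1.0, post=0.8, steps=2000, w0=0.0):
    """Run the learning rule under constant pre/post activity."""
    w = w0
    for _ in range(steps):
        w = hebbian_step(w, pre, post)
    return w
```

Under constant pre- and postsynaptic activity the weight converges exponentially to the equilibrium w* = eta*pre*post/a; exponential stability results of the kind derived in the paper guarantee that such learned synaptic states are robustly attracting.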


2009 ◽ Vol 43 (1) ◽ pp. 145-161 ◽ Author(s): Sannay Mohamad, Haydar Akça, Valéry Covachev

Abstract A discrete-time analogue is formulated for an impulsive Cohen-Grossberg neural network with transmission delay in such a way that the global exponential stability characteristics of a unique equilibrium point of the network are preserved. The formulation is based on extending the existing semidiscretization method that has been implemented for computer simulations of neural networks with linear stabilizing feedback terms. The exponential convergence in the p-norm of the analogue towards the unique equilibrium point is analysed by exploiting an appropriate Lyapunov sequence and the properties of an M-matrix. The main result yields a Lyapunov exponent that involves the magnitude and frequency of the impulses. One can use the result for deriving the exponential stability of non-impulsive discrete-time neural networks, and also for simulating the exponential stability of impulsive and non-impulsive continuous-time networks.
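The semidiscretization idea mentioned above can be sketched on a single Hopfield-type neuron with a delayed feedback term; this is a simplified stand-in (no impulses, one neuron, assumed parameter values), not the impulsive Cohen-Grossberg network of the paper. Over each step h the linear decay term is solved exactly while the delayed term is held constant, which preserves the continuous-time equilibrium.

```python
import math

# Assumed scalar model for illustration:
#   x'(t) = -a*x(t) + w*f(x(t - tau)) + I
# Semidiscretization over a step h: solve the linear part exactly and
# freeze the delayed/input terms, giving the discrete-time analogue
#   x[n+1] = e^{-a h} x[n] + ((1 - e^{-a h})/a) * (w*f(x[n-k]) + I),
# which shares the equilibrium x* of a*x* = w*f(x*) + I with the ODE.

def semidiscrete_step(x_n, x_delayed, a, w, I, h, f):
    decay = math.exp(-a * h)
    return decay * x_n + (1.0 - decay) / a * (w * f(x_delayed) + I)

def simulate(a=2.0, w=0.5, I=1.0, tau=0.1, h=0.01, steps=5000, x0=3.0):
    """Iterate the analogue from a constant initial history."""
    f = math.tanh
    k = round(tau / h)          # transmission delay measured in steps
    hist = [x0] * (k + 1)       # constant initial history on [-tau, 0]
    for _ in range(steps):
        hist.append(semidiscrete_step(hist[-1], hist[-1 - k], a, w, I, h, f))
    return hist[-1]
```

Because |w|·sup|f'|/a = 0.25 < 1 here, the iteration contracts toward the unique equilibrium regardless of h, illustrating the stability-preserving property that the paper establishes in much greater generality (p-norm, impulses, M-matrix conditions).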

