Global Exponential Stability of Discrete Time Hopfield Neural Networks with Delays

Author(s): Qiang Zhang, Wenbing Liu, Xiaopeng Wei
2007, Vol 2007, pp. 1-9
Author(s): Qiang Zhang, Xiaopeng Wei, Jin Xu

Global exponential stability of a class of discrete-time Hopfield neural networks with variable delays is considered. By means of a difference inequality, a new global exponential stability result is established. The result requires only that the delay be bounded, and is therefore less restrictive than those presented in earlier references. Two examples are given to demonstrate the effectiveness of the result.
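The model class behind this abstract can be sketched numerically. The following is a minimal illustrative simulation (not the paper's own example) of a two-neuron discrete-time Hopfield network with a bounded time-varying delay; the parameter values are assumptions chosen so that a standard contraction-type sufficient condition holds, and trajectories from different initial histories merge, as global exponential stability predicts.

```python
import numpy as np

def simulate(x_hist, steps=200):
    """Iterate x(n+1) = a*x(n) + B f(x(n - tau(n))) + I from an initial
    history x_hist of shape (tau_max + 1, 2). Illustrative parameters only."""
    a = np.array([0.5, 0.4])          # self-feedback coefficients, |a_i| < 1
    B = np.array([[0.1, -0.05],
                  [0.05, 0.1]])       # small interconnection weights
    I = np.array([0.2, -0.1])         # constant external inputs
    f = np.tanh                       # Lipschitz activation (constant 1)
    tau_max = x_hist.shape[0] - 1
    hist = list(x_hist)
    for n in range(steps):
        tau = 1 + (n % tau_max)       # bounded, time-varying delay
        x = hist[-1]
        x_delayed = hist[-1 - tau]
        hist.append(a * x + B @ f(x_delayed) + I)
    return np.array(hist)

# Two widely separated initial histories converge to the same equilibrium.
traj1 = simulate(np.full((4, 2), 3.0))
traj2 = simulate(np.full((4, 2), -3.0))
gap = np.abs(traj1[-1] - traj2[-1]).max()
```

Here the row sums |a_i| + sum_j |b_ij| are at most 0.65 < 1, so the map is a contraction in the supremum norm regardless of how the bounded delay varies; `gap` shrinks exponentially in the number of steps.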


2009, Vol 43 (1), pp. 145-161
Author(s): Sannay Mohamad, Haydar Akça, Valéry Covachev

Abstract A discrete-time analogue is formulated for an impulsive Cohen-Grossberg neural network with transmission delay in a manner in which the global exponential stability characteristics of a unique equilibrium point of the network are preserved. The formulation is based on extending the existing semidiscretization method that has been implemented for computer simulations of neural networks with linear stabilizing feedback terms. The exponential convergence in the p-norm of the analogue towards the unique equilibrium point is analysed by exploiting an appropriate Lyapunov sequence and properties of an M-matrix. The main result yields a Lyapunov exponent that involves the magnitude and frequency of the impulses. The result can be used to derive the exponential stability of non-impulsive discrete-time neural networks, and also to simulate the exponential stability of impulsive and non-impulsive continuous-time networks.
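The semidiscretization idea referred to above can be illustrated on a simple non-impulsive Hopfield-type special case. The sketch below is an assumption-laden illustration, not the paper's construction: for x_i'(t) = -a_i x_i(t) + sum_j b_ij f_j(x_j(t - tau)) + I_i, the analogue x_i(n+1) = x_i(n) e^{-a_i h} + theta_i(h) [sum_j b_ij f_j(x_j(n - k)) + I_i], with theta_i(h) = (1 - e^{-a_i h})/a_i, shares every equilibrium with the continuous network, which is the key preservation property.

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the paper).
a = np.array([1.0, 2.0])              # positive decay rates
B = np.array([[0.2, -0.1],
              [0.1, 0.3]])            # interconnection weights
I = np.array([0.5, -0.2])             # external inputs
f = np.tanh                           # Lipschitz activation
h = 0.1                               # discretization step
theta = (1.0 - np.exp(-a * h)) / a    # semidiscretization gain

def step(x, x_delayed):
    """One step of the semidiscrete analogue."""
    return x * np.exp(-a * h) + theta * (B @ f(x_delayed) + I)

# Equilibria of the continuous system solve a_i x_i = (B f(x) + I)_i.
# Fixed-point iteration converges here since the weights are dominated by a.
x_star = np.zeros(2)
for _ in range(500):
    x_star = (B @ f(x_star) + I) / a

# The same point is a fixed point of the discrete analogue.
residual = np.abs(step(x_star, x_star) - x_star).max()
```

The check works because at equilibrium B f(x*) + I = a x*, so the update reduces to x* e^{-ah} + (1 - e^{-ah}) x* = x* exactly, for any step size h; this is the sense in which the semidiscretization preserves the equilibrium, in contrast to, say, a forward Euler scheme, which preserves equilibria but not necessarily their stability for large h.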

