On the dynamics of discrete-time, continuous-state Hopfield neural networks

Author(s): Lipo Wang
2013, Vol 2013, pp. 1-10
Author(s): Yanxia Sun, Zenghui Wang, Barend Jacobus van Wyk

A new neural-network-based optimization algorithm is proposed. The model is a discrete-time, continuous-state Hopfield neural network whose states are updated synchronously. The algorithm combines the advantages of traditional particle swarm optimization (PSO), chaos, and Hopfield neural networks: particles learn from their own experience and from that of surrounding particles, their search behavior is ergodic, and convergence of the swarm is guaranteed. The effectiveness of the approach is demonstrated through simulations on typical optimization problems.
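The core building block named in the abstract, a discrete-time, continuous-state Hopfield network with synchronous updates, can be sketched as follows. This is an illustrative minimal form only: the weights, bias, and gain here are hypothetical, and the paper's full algorithm layers PSO-style learning and chaotic search on top of this update, which is not reproduced here.

```python
import numpy as np

def hopfield_step(x, W, b, beta=1.0):
    """One synchronous update: every component of the continuous state
    vector is refreshed at once from the previous state (discrete time,
    continuous state)."""
    return np.tanh(beta * (W @ x + b))  # smooth, bounded activation

# Hypothetical network: symmetric weights, no self-connections,
# as in the classic Hopfield setting.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 4))
W = 0.5 * (W + W.T)        # symmetrize
np.fill_diagonal(W, 0.0)   # zero self-connections
b = rng.standard_normal(4)

x = rng.uniform(-1, 1, 4)  # random continuous initial state
for _ in range(200):
    x = hopfield_step(x, W, b)
```

Because the activation is `tanh`, the state stays inside the hypercube [-1, 1]^4 at every step, which is the "continuous-state" counterpart of the binary Hopfield model.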


2020, Vol 122, pp. 54-67
Author(s): Fidelis Zanetti de Castro, Marcos Eduardo Valle

2007, Vol 2007, pp. 1-9
Author(s): Qiang Zhang, Xiaopeng Wei, Jin Xu

The global exponential stability of a class of discrete-time Hopfield neural networks with variable delays is considered. Using a difference inequality, a new global exponential stability criterion is derived. The criterion requires only that the delays be bounded, and is therefore less restrictive than those presented in earlier references. Two examples illustrate the effectiveness of the result.
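A minimal sketch of the class of systems the abstract studies, a discrete-time Hopfield network with a bounded time-varying delay, is given below. The parameters are hypothetical and chosen so that a contraction-type condition of the kind the abstract alludes to holds (state-feedback gains plus delayed weight gains well below one); the specific difference inequality of the paper is not reproduced.

```python
import numpy as np

def delayed_hopfield(a, B, I, tau, x_hist, steps, f=np.tanh):
    """Iterate x(n+1) = a * x(n) + B f(x(n - tau(n))) + I, a
    discrete-time Hopfield model with a bounded variable delay tau(n).
    x_hist holds the initial history x(-tau_max), ..., x(0)."""
    hist = list(x_hist)
    for n in range(steps):
        d = tau(n)                  # delay at step n (bounded)
        x_delayed = hist[-1 - d]    # delayed state x(n - tau(n))
        x_new = a * hist[-1] + B @ f(x_delayed) + I
        hist.append(x_new)
    return np.array(hist)

# Hypothetical parameters: |a_i| < 1 and small delayed weights, so the
# origin (an equilibrium, since tanh(0) = 0 and I = 0) attracts
# trajectories exponentially despite the varying delay.
a = np.array([0.3, 0.4])
B = np.array([[0.10, -0.05],
              [0.05,  0.10]])
I = np.zeros(2)
tau = lambda n: 1 + (n % 2)          # bounded delay alternating in {1, 2}
history = [np.array([1.0, -1.0])] * 3  # constant history over max delay
traj = delayed_hopfield(a, B, I, tau, history, steps=100)
```

Running this, the state norm decays toward zero even though the delay keeps changing, which is the behavior a delay-bound-only stability criterion guarantees for this parameter regime.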

