Generating exponentially stable states for a Hopfield Neural Network

2018 ◽  
Vol 275 ◽  
pp. 358-365 ◽  
Author(s):  
Erick Cabrera ◽  
Humberto Sossa


Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 726 ◽  
Author(s):  
Giorgio Gosti ◽  
Viola Folli ◽  
Marco Leonetti ◽  
Giancarlo Ruocco

In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost always excluded from both artificial and biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows how, in an N-node Hopfield neural network with autapses, the number of stored patterns (P) is not limited to the well-known bound 0.14N, as it is for networks without autapses. More precisely, it describes how, as the number of stored patterns increases well beyond the 0.14N threshold, for P much greater than N, the retrieval error asymptotically approaches a value below unity. Consequently, the reduction of retrieval errors allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, soon after, new results showed that, in the thermodynamic limit, given a network with autapses in this high-storage regime, the basin of attraction of each stored memory shrinks to a single state. This means that, for each stable state associated with a stored memory, even a single bit error in the initial pattern would lead the system to a stationary state associated with a different memory. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy allows us to form what we call an absorbing-neighborhood of states surrounding each stored memory. An absorbing-neighborhood is a set of states within a given Hamming distance of a network state; it is absorbing because, in the long-time limit, states inside it are absorbed by stable states in the set.
We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing-neighborhood of exponentially growing size.
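The autapse-enabled variant discussed above differs from the classical Hopfield network only in that the diagonal of the Hebbian weight matrix is retained rather than zeroed. The following is a minimal sketch of that setup (the pattern count, network size, and tie-breaking rule are illustrative choices, not the paper's exact construction):

```python
import numpy as np

def hebbian_weights(patterns, keep_autapses=True):
    """Hebbian weight matrix W = (1/N) * sum_p x_p x_p^T.

    Classical Hopfield networks zero the diagonal (no autapses);
    keeping it gives the autapse-enabled variant discussed above.
    """
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    if not keep_autapses:
        np.fill_diagonal(W, 0.0)
    return W

def retrieve(W, state, steps=50):
    """Synchronous sign-update dynamics until a fixed point (or step cap)."""
    s = state.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1          # break ties toward +1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

rng = np.random.default_rng(0)
N, P = 100, 5                          # well below the 0.14N load, for illustration
patterns = rng.choice([-1, 1], size=(P, N)).astype(float)
W = hebbian_weights(patterns, keep_autapses=True)

# Flip a few bits of a stored pattern and let the dynamics correct them.
probe = patterns[0].copy()
probe[:3] *= -1
recovered = retrieve(W, probe)
```

At this low load the corrupted probe falls back onto the stored pattern; the high-storage regime (P much greater than N) discussed in the abstract is where the basin-shrinking phenomenon appears.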


2008 ◽  
Vol 18 (07) ◽  
pp. 2029-2037
Author(s):  
WEI WU ◽  
BAO TONG CUI ◽  
ZHIGANG ZENG

In this paper, the global exponential stability of recurrent neural networks with continuously distributed delays is investigated. New theoretical results are presented for the case of external stimuli. It is shown that the recurrent neural network is globally exponentially stable, and an estimate of the location of the equilibrium point can be obtained. As typical representatives, the Hopfield neural network (HNN) and the cellular neural network (CNN) are examined in detail. A comparison with previous results shows that the present results are an improvement.
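The qualitative claim can be illustrated numerically. The toy sketch below simulates a continuous-time Hopfield network dx/dt = -x + W·tanh(x) + I; the distributed delays of the paper are omitted, and the norm condition on W (spectral norm below 1, giving a contraction) is one standard sufficient condition, not the paper's own criterion. Trajectories from different initial states converge to the same equilibrium:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
W = 0.5 * A / np.linalg.norm(A, 2)   # spectral norm 0.5 < 1: contraction
I = rng.standard_normal(n)           # constant external stimulus

def simulate(x0, dt=0.01, steps=3000):
    """Forward-Euler integration of dx/dt = -x + W*tanh(x) + I."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        x += dt * (-x + W @ np.tanh(x) + I)
    return x

# Two widely separated initial conditions...
x_a = simulate(np.ones(n) * 3.0)
x_b = simulate(-np.ones(n) * 3.0)
# ...end up at the same equilibrium point.
gap = np.linalg.norm(x_a - x_b)
```

Because tanh is 1-Lipschitz and the spectral norm of W is 0.5, the dynamics contract at rate at least 0.5, so the gap between the two trajectories shrinks exponentially in time.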


2008 ◽  
Vol 18 (02) ◽  
pp. 135-145 ◽  
Author(s):  
TEIJIRO ISOKAWA ◽  
HARUHIKO NISHIMURA ◽  
NAOTAKE KAMIURA ◽  
NOBUYUKI MATSUI

Associative memory networks based on a quaternionic Hopfield neural network are investigated in this paper. These networks are composed of quaternionic neurons, and the inputs, outputs, thresholds, and connection weights are represented by quaternions, a class of hypercomplex numbers. The energy function of the network and the Hebbian rule for embedding patterns are introduced. The stable states and their basins are explored for networks with three and four neurons. It is clarified that there exist at most 16 stable states, called multiplet components, as the degenerate stored patterns, and each of these states has its own basin in the quaternionic networks.
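A minimal sketch of the quaternionic machinery follows. The Hamilton product and conjugate are standard; the component-wise signum activation and the particular Hebbian rule W[i][j] = (1/N)·Σ_p x_i·conj(x_j) are illustrative assumptions rather than the paper's exact formulation. For a single stored pattern, conj(x_j)·x_j is a positive real quaternion, so the pattern is a fixed point of the update:

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions stored as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def qconj(a):
    """Quaternion conjugate: negate the imaginary parts."""
    return np.array([a[0], -a[1], -a[2], -a[3]])

def qsign(a):
    """Component-wise signum activation (an illustrative choice)."""
    s = np.sign(a)
    s[s == 0] = 1.0
    return s

def hebbian_qweights(patterns):
    """W[i, j] = (1/N) * sum_p x_i^p * conj(x_j^p), a Hebbian-style rule."""
    P, N, _ = patterns.shape
    W = np.zeros((N, N, 4))
    for p in range(P):
        for i in range(N):
            for j in range(N):
                W[i, j] += qmul(patterns[p, i], qconj(patterns[p, j])) / N
    return W

def update(W, s):
    """One synchronous pass of the quaternionic Hopfield dynamics."""
    N = s.shape[0]
    out = np.zeros_like(s)
    for i in range(N):
        h = np.zeros(4)
        for j in range(N):
            h += qmul(W[i, j], s[j])
        out[i] = qsign(h)
    return out

rng = np.random.default_rng(1)
pat = rng.choice([-1.0, 1.0], size=(1, 3, 4))  # one pattern, three neurons
W = hebbian_qweights(pat)
```

Here each neuron state is a quaternion with components in {-1, +1}; the three-neuron size mirrors the smallest network analyzed in the abstract.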


2009 ◽  
Vol 29 (4) ◽  
pp. 1028-1031
Author(s):  
Wei-xin GAO ◽  
Xiang-yang MU ◽  
Nan TANG ◽  
Hong-liang YAN
