Storage Capacity of the Hopfield Network Associative Memory
Author(s): Yue Wu, Jianqing Hu, Wei Wu, Yong Zhou, K. L. Du
2021, Vol 11 (2)
Author(s): Brendan P. Marsh, Yudan Guo, Ronen M. Kroeze, Sarang Gopalakrishnan, Surya Ganguli, ...

Entropy, 2019, Vol 21 (8), pp. 726
Author(s): Giorgio Gosti, Viola Folli, Marco Leonetti, Giancarlo Ruocco

In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost always excluded from both artificial and biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research has shown that, in an N-node Hopfield neural network with autapses, the number of stored patterns P is not limited to the well-known bound 0.14N that holds for networks without autapses. More precisely, as the number of stored patterns increases well beyond the 0.14N threshold, for P much greater than N, the retrieval error asymptotically approaches a value below unity. This reduction of retrieval errors allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, later results showed that, in the thermodynamic limit, the basin of attraction of each stored memory in this high-storage regime shrinks to a single state: even a single bit error in the initial pattern leads the system to a stationary state associated with a different memory. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy forms what we call an absorbing neighborhood of states surrounding each stored memory: the set of states within a given Hamming distance of the memory, absorbing in the sense that, in the long-time limit, states inside it converge to stable states in the set.
We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing neighborhood of exponentially growing size.
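The distinction between the classical Hopfield construction and the autapse-enabled variant discussed above comes down to whether the diagonal of the Hebbian weight matrix is zeroed. The following is a minimal illustrative sketch (not the paper's actual construction or its absorbing-neighborhood strategy): the standard outer-product learning rule with an `autapses` switch, and simple synchronous sign-update retrieval dynamics.

```python
import numpy as np

def hebbian_weights(patterns, autapses=True):
    """Hebbian (outer-product) weight matrix for P patterns of N +/-1 units.

    With autapses=False the diagonal is zeroed, as in the classical
    Hopfield construction (capacity ~ 0.14 N); autapses=True keeps the
    self-connections W[i, i], as in the autapse-enabled network.
    """
    X = np.asarray(patterns, dtype=float)   # shape (P, N), entries +/-1
    W = X.T @ X / X.shape[1]                # Hebbian rule, scaled by 1/N
    if not autapses:
        np.fill_diagonal(W, 0.0)
    return W

def retrieve(W, state, steps=50):
    """Synchronous sign-update dynamics until a fixed point (or step limit)."""
    s = np.asarray(state, dtype=float)
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0             # break zero-field ties to +1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Toy check: with few random patterns, each stored pattern is a fixed point.
rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(3, 64))
W = hebbian_weights(patterns, autapses=True)
assert np.array_equal(retrieve(W, patterns[0]), patterns[0])
```

In the low-load regime shown here (P = 3 patterns, N = 64 units), crosstalk between patterns is small and retrieval succeeds with or without autapses; the differences described in the abstract appear in the high-storage regime P >> N.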


2017, Vol 168 (2), pp. 288-299
Author(s): Mete Demircigil, Judith Heusel, Matthias Löwe, Sven Upgang, Franck Vermet

1999, Vol 09 (04), pp. 351-370
Author(s): M. Sreenivasa Rao, Arun K. Pujari

A new paradigm of neural network architecture is proposed that works as an associative memory with capabilities of pruning and order-sensitive learning. The network has a composite structure wherein each node of the network is itself a Hopfield network. The Hopfield network employs an order-sensitive learning technique and converges to user-specified stable states without any spurious states. This is based on the geometrical structure of the network and of the energy function. The network is designed so that it allows pruning in binary order as it progressively carries out associative memory retrieval. The capacity of the network is 2^n, where n is the number of basic nodes in the network. The capabilities of the network are demonstrated by experiments in three different application areas, namely a Library Database, a Protein Structure Database, and Natural Language Understanding.


Author(s): Tet Yeap

A trainable analog restricted Hopfield network is presented in this paper. It consists of two layers of nodes, visible and hidden, connected by weighted directional paths forming a bipartite graph with no intralayer connections. An energy (Lyapunov) function is derived to show that the proposed network converges to stable states. The proposed network can be trained using either the modified SPSA or the BPTT algorithm to ensure that all the weights are symmetric. Simulation results show that the presence of hidden nodes increases the network's memory capacity. Using EXOR as an example, the network can be trained to be a dynamic classifier. Using A, U, T, S as training characters, the network was trained to be an associative memory. Simulation results show that the network can perfectly re-create images from noisy inputs, with higher noise tolerance than the standard Hopfield network and the restricted Boltzmann machine. Simulation results also illustrate the importance of feedback iteration when using an associative memory to re-create images from noisy inputs.
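The core dynamics of a restricted (bipartite) Hopfield network can be sketched as alternating updates between the visible and hidden layers through a single shared weight matrix, which keeps the connections symmetric. The sketch below is illustrative only: it uses binary +/-1 units and a hand-built outer-product weight matrix, not the paper's analog units or its SPSA/BPTT-trained weights.

```python
import numpy as np

def restricted_hopfield_step(W, v):
    """One alternating update of a bipartite (visible/hidden) network.

    W has shape (n_visible, n_hidden) and is used in both directions,
    so the connection graph is symmetric and the energy E = -v^T W h
    never increases over an update.
    """
    h = np.sign(W.T @ v)
    h[h == 0] = 1.0                     # break zero-field ties to +1
    v_new = np.sign(W @ h)
    v_new[v_new == 0] = 1.0
    return v_new

def settle(W, v, max_iters=100):
    """Iterate until the visible state stops changing (a stable state)."""
    for _ in range(max_iters):
        v_new = restricted_hopfield_step(W, v)
        if np.array_equal(v_new, v):
            break
        v = v_new
    return v

# Toy demo: one stored visible pattern via an outer-product weight matrix
# (an illustrative Hebbian-style choice, not the paper's trained weights).
rng = np.random.default_rng(1)
v0 = rng.choice([-1.0, 1.0], size=16)
h0 = rng.choice([-1.0, 1.0], size=8)
W = np.outer(v0, h0)
noisy = v0.copy()
noisy[:3] *= -1                         # corrupt 3 of the 16 visible bits
assert np.array_equal(settle(W, noisy), v0)
```

The `settle` loop is the "feedback iteration" the abstract emphasizes: a single forward pass through the hidden layer may not clean up a noisy input, but repeated visible-hidden-visible passes descend the energy function until a stable state is reached.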

