Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks

Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 726 ◽  
Author(s):  
Giorgio Gosti ◽  
Viola Folli ◽  
Marco Leonetti ◽  
Giancarlo Ruocco

In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost always excluded from both artificial and biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows how, in an N-node Hopfield neural network with autapses, the number of stored patterns (P) is not limited to the well-known bound 0.14N, as it is for networks without autapses. More precisely, it describes how, as the number of stored patterns increases well over the 0.14N threshold, for P much greater than N, the retrieval error asymptotically approaches a value below unity. Consequently, the reduction of retrieval errors allows a number of stored memories which largely exceeds what was previously considered possible. Unfortunately, soon after, new results showed that, in the thermodynamic limit, given a network with autapses in this high-storage regime, the basin of attraction of each stored memory shrinks to a single state. This means that, for each stable state associated with a stored memory, even a single bit error in the initial pattern would lead the system to a stationary state associated with a different memory. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome the limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy allows us to form what we call an absorbing-neighborhood of states surrounding each stored memory. An absorbing-neighborhood is a set, defined by a Hamming distance around a network state, which is absorbing because, in the long-time limit, every state inside it is absorbed by a stable state in the set.
We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing-neighborhood whose size grows exponentially.
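The role of the autapses can be illustrated numerically. In the sketch below (an illustrative NumPy experiment, not the paper's exact construction), a Hebbian network loaded at P = N, far above the 0.14N bound, makes markedly fewer one-step retrieval errors when the self-connections are kept than when the diagonal is zeroed as usual.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200
P = N  # load far above the 0.14*N Hopfield bound

X = rng.choice([-1, 1], size=(P, N))  # stored bipolar patterns, one per row
W = X.T @ X / N                       # Hebbian weights, autapses (diagonal) kept
W_no = W.copy()
np.fill_diagonal(W_no, 0.0)           # conventional network: autapses removed

def one_step_error(W):
    """Fraction of bits that flip when each stored pattern is updated once."""
    S = np.where(X @ W >= 0, 1, -1)   # W is symmetric, so rows of X @ W are the fields
    return np.mean(S != X)

err_autapse = one_step_error(W)
err_classic = one_step_error(W_no)
# At this load, the network that retains its autapses is far more stable.
```

The diagonal term adds a contribution of order P/N aligned with the stored bit, which at high load dominates the crosstalk noise; this is one way to see why the retrieval error can stay bounded even for P much greater than 0.14N.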

Author(s):  
Olga RUZAKOVA

The article presents a methodological approach to assessing the investment attractiveness of an enterprise based on the mathematical apparatus of the Hopfield neural network. An extended set of evaluation parameters of the investment process has been compiled, and an algorithm for formalizing the decision-making process regarding the investment attractiveness of the enterprise, based on the mathematical apparatus of neural networks, has been developed. The proposed approach accommodates constantly changing sets of quantitative and qualitative parameters and identifies the appropriate level of investment attractiveness with minimal expense of money and time, by selecting the stored standard of the Hopfield network that is most similar to the pattern characterizing the enterprise's activity. The developed formalization of the investment process makes it possible to take investment decisions under incomplete and heterogeneous information, relying on the methodological tools of neural networks.


2019 ◽  
Vol 8 (2) ◽  
pp. 4928-4937 ◽  

Odia character and digit recognition is a topical problem in computer vision. This paper discusses a Hopfield neural network designed to recognize printed Odia characters. Optical Character Recognition (OCR) is the conversion of images of handwritten, printed, or typewritten text into a machine-encoded version. An Artificial Neural Network (ANN) was trained as a classifier, following the rule of the Hopfield network, using code written in MATLAB. All preprocessing of the data (image acquisition, binarization, skeletonization, skew detection and correction, image cropping, resizing, and digitization) was likewise carried out in MATLAB. OCR systems have been designed for many different languages and scripts. Segmentation, feature extraction, and classification, the well-known stages of Odia character recognition, are reviewed and outlined with their relative strengths and weaknesses, so the paper may serve as a guide for those interested in working in the field of Odia character recognition. Recognition of printed Odia characters, numerals, and machine characters finds numerous applications in banks, industries, and offices. In the proposed work, we develop an efficient and robust mechanism in which Odia characters are recognized by a Hopfield Neural Network (HNN).


2021 ◽  
pp. 1-15
Author(s):  
Masaki Kobayashi

Abstract A complex-valued Hopfield neural network (CHNN) is a multistate Hopfield model. A quaternion-valued Hopfield neural network (QHNN) with a twin-multistate activation function was proposed to reduce the number of weight parameters of a CHNN. Dual connections (DCs) were introduced to QHNNs to improve noise tolerance. DCs take advantage of the noncommutativity of quaternions and consist of two weights between neurons. A QHNN with DCs provides much better noise tolerance than a CHNN. Although a CHNN and a QHNN with DCs have the same number of weight parameters, the storage capacity of the projection rule for QHNNs with DCs is half that for CHNNs and equals that of conventional QHNNs. The small storage capacity of QHNNs with DCs is caused by the projection rule, not by the architecture. In this work, the Hebbian rule is introduced, and it is proved by stochastic analysis that the storage capacity of a QHNN with DCs is 0.8 times that of a CHNN.
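The projection rule discussed above can be illustrated in the real-valued case (an illustrative NumPy sketch; the quaternion-valued model with dual connections is not reproduced here). Under the projection rule, the weight matrix is the orthogonal projector onto the span of the stored patterns, so each stored pattern is an exact fixed point:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 32, 12
X = rng.choice([-1.0, 1.0], size=(N, P))  # columns are the stored patterns

# Projection (pseudo-inverse) rule: W projects onto the span of the stored
# patterns, so W @ X == X exactly (up to floating-point error).
W = X @ np.linalg.pinv(X)

# Each stored column satisfies sign(W x) == x, i.e. it is a fixed point.
recalled = np.sign(W @ X)
```

This exactness is the appeal of the projection rule; the abstract's point is that, in the DC architecture, its capacity is limited by the rule itself, which motivates the Hebbian analysis instead.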


2018 ◽  
Vol 2018 ◽  
pp. 1-5 ◽  
Author(s):  
Masaki Kobayashi

A twin-multistate quaternion Hopfield neural network (TMQHNN) is a multistate Hopfield model that can store multilevel information, such as image data. Storage capacity is an important problem for Hopfield neural networks. Jankowski et al. approximated the crosstalk terms of complex-valued Hopfield neural networks (CHNNs) by 2-dimensional normal distributions and evaluated their storage capacities. In this work, we evaluate the storage capacities of TMQHNNs based on their idea.


2020 ◽  
Vol 8 (5) ◽  
pp. 4934-4938

In this research paper, we train and test a Hopfield neural network for recognizing QR codes. We propose an algorithm for denoising QR codes using the concept of a parallel Hopfield neural network. One of the biggest drawbacks of a noisy QR code is its poor performance and low storage capacity. Using the Hopfield network, we can denoise the QR code and thereby increase the storage capacity.
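The denoising step can be sketched with a classic bipolar Hopfield network (an illustrative Python sketch; the QR-specific encoding and the parallel architecture of the paper are omitted, and the 10×10 pattern size is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 100, 3  # e.g. a 10x10 binary code matrix, flattened; 3 stored codes

patterns = rng.choice([-1, 1], size=(P, N))

# Classic Hebbian weights with zeroed diagonal
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def denoise(state, steps=20):
    """Synchronous updates until a fixed point or the step limit."""
    s = state.copy()
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s

# Flip 10 of the 100 bits and let the network restore the stored code
noisy = patterns[0].copy()
flip = rng.choice(N, size=10, replace=False)
noisy[flip] *= -1
restored = denoise(noisy)
```

At this low load (3 patterns over 100 neurons) the corrupted code sits well inside the basin of attraction, so a few update sweeps recover it.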


2010 ◽  
Vol 97-101 ◽  
pp. 3514-3518
Author(s):  
Jun You Shi ◽  
Hong Yan Zhai ◽  
Chuang Sheng Su

An irregular-parts optimal layout method based on artificial neural networks is proposed. The manufacturing process of the parts is taken into account in the layout problem: each side of a shape is expanded to allow for the machining allowance. A Self-Organizing Map (SOM) and a Hopfield artificial neural network are integrated to complete the automatic layout. In the beginning, the irregular parts are randomly distributed. The Self-Organizing Map is used to find the best position for each irregular part by moving it, so that the overlapping area is gradually reduced to zero. The Hopfield neural network is then used to rotate each part, and each part's optimum rotation angle is obtained when the network reaches a stable state. The algorithm presented in this paper can solve both the irregular-parts and the rectangular-parts layout problem in a given region. Examples indicate that the algorithm is effective and practical.


1995 ◽  
Vol 06 (03) ◽  
pp. 317-357 ◽  
Author(s):  
M.B. SUKHASWAMI ◽  
P. SEETHARAMULU ◽  
ARUN K. PUJARI

The aim of the present work is to recognize printed and handwritten Telugu characters using artificial neural networks (ANNs). Earlier work on recognition of Telugu characters has been done using conventional pattern recognition techniques. We make an initial attempt here at using neural networks for recognition, with the aim of improving upon earlier methods, which do not perform effectively in the presence of noise and distortion in the characters. The Hopfield model of neural network, working as an associative memory, is initially chosen for recognition purposes. Due to the limited capacity of the Hopfield neural network, we propose a new scheme, named here the Multiple Neural Network Associative Memory (MNNAM). The limitation in storage capacity has been overcome by combining multiple neural networks which work in parallel. It is also demonstrated that the Hopfield network is suitable for recognizing noisy printed characters as well as handwritten characters written by different “hands” in a variety of styles. Detailed experiments have been carried out using several learning strategies, and results are reported. It is shown here that satisfactory recognition is possible using the proposed strategy. A detailed preprocessing scheme for the Telugu characters from digitized documents is also described.
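One plausible reading of the MNNAM scheme is sketched below: the stored patterns are split across several sub-networks, each loaded well below capacity, and a probe is answered by whichever sub-network's recalled state best matches it. This is an illustrative NumPy sketch; `mnnam_recall` and its best-match selection rule are our own simplification, not necessarily the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P, K = 100, 12, 4  # 12 patterns split over 4 sub-networks

patterns = rng.choice([-1, 1], size=(P, N))
chunks = np.array_split(patterns, K)  # 3 patterns per sub-network

def train(pats):
    """Hebbian weights with zeroed diagonal for one sub-network."""
    W = pats.T @ pats / N
    np.fill_diagonal(W, 0.0)
    return W

weights = [train(c) for c in chunks]

def recall(W, state, steps=20):
    s = state.copy()
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s

def mnnam_recall(probe):
    # Query every sub-network in parallel; keep the answer closest to the probe.
    outs = [recall(W, probe) for W in weights]
    return max(outs, key=lambda s: s @ probe)

# Corrupt a pattern stored in the second sub-network and recover it
noisy = patterns[5].copy()
flip = rng.choice(N, size=8, replace=False)
noisy[flip] *= -1
restored = mnnam_recall(noisy)
```

Because each sub-network holds only 3 of the 12 patterns, each operates far below the 0.14N capacity limit, while the ensemble as a whole stores all 12.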


1989 ◽  
Vol 01 (02) ◽  
pp. 133-141 ◽  
Author(s):  
Michael R. Davenport ◽  
Geoffrey W. Hoffmann

This paper describes a process for adding hidden neurons to a fully recurrent Hopfield neural network in such a way as to optimize the orthogonality of the memory space. The process uses the network itself, operating with a "reverse update rule", to assign optimal values to the hidden neurons for each memory. The outer product rule is used to modify synaptic strengths as each new memory is added. As in a standard Hopfield network, this is a fast process because it is noniterative. Tri-state hidden neurons, initially set to zero, are used in the recovery of memories. Simulations indicate that the storage capacity of the network for uncorrelated memories, and the radius of attraction of each memory, are significantly better than those of the standard Hopfield network. The use of hidden neurons permits flexibility in the network capacity for memories of a given length. The network is able to solve second-order hetero-associative problems, as illustrated with solutions to the XOR set of associations.


2013 ◽  
Vol 2013 ◽  
pp. 1-9 ◽  
Author(s):  
Xia Huang ◽  
Zhen Wang ◽  
Yuxia Li

A fractional-order two-neuron Hopfield neural network with delay is proposed based on the classic well-known Hopfield neural networks, and further, the complex dynamical behaviors of such a network are investigated. A great variety of interesting dynamical phenomena, including single-periodic, multiple-periodic, and chaotic motions, are found to exist. The existence of chaotic attractors is verified by the bifurcation diagram and phase portraits as well.

