Model architecture for associative memory in a neural network of spiking neurons

2012 ◽  
Vol 391 (3) ◽  
pp. 843-848 ◽  
Author(s):  
Everton J. Agnes ◽  
Rubem Erichsen ◽  
Leonardo G. Brunnet

2014 ◽ 
pp. 32-37
Author(s):  
Akira Imada

We explore a weight-configuration space in search of solutions that make a neural network of spiking neurons perform certain tasks. For the task of simulating an associative memory model, one such solution is already known: a weight configuration that has learned a set of patterns using Hebb's rule; we conjecture that many others exist which have not yet been found. While searching for such solutions, we observed that the so-called fitness landscape is almost everywhere a completely flat plain of altitude zero, in which the Hebbian weight configuration is the single peak, and, in addition, the sidewall of that peak has no gradient at all. Under such circumstances, how could we search for the other peaks? This paper is a call for challenges to this problem.
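The known Hebbian solution the abstract refers to can be sketched with a Hopfield-style binary network — an illustrative stand-in for the spiking network in the paper, with pattern values and update rule chosen here for simplicity:

```python
import numpy as np

def hebbian_weights(patterns):
    # Hebb's rule: W = (1/N) * sum of outer products of the stored
    # patterns, with the self-connections (diagonal) zeroed out.
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, state, steps=10):
    # Synchronous sign-threshold updates until the state stops changing.
    for _ in range(steps):
        new = np.sign(w @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Two orthogonal +/-1 patterns stored in an 8-neuron network.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1,  1, -1, -1, 1, 1, -1, -1]])
w = hebbian_weights(patterns)

noisy = patterns[0].copy()
noisy[0] *= -1  # corrupt one bit
print(recall(w, noisy))  # recovers the first stored pattern
```

This weight configuration is exactly the kind of single, isolated solution the abstract describes: small perturbations of `w` typically yield zero fitness, which is what makes gradient-based search for alternative peaks so difficult.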


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1065
Author(s):  
Moshe Bensimon ◽  
Shlomo Greenberg ◽  
Moshe Haiut

This work presents a new approach to sound preprocessing and classification based on a spiking neural network. The approach is biologically inspired, using spiking neurons and a Spike-Timing-Dependent Plasticity (STDP)-based learning rule. We propose a biologically plausible sound classification framework that uses a Spiking Neural Network (SNN) to detect the frequencies embedded within an acoustic signal. This work also demonstrates an efficient hardware implementation of the SNN based on the low-power Spike Continuous Time Neuron (SCTN). The proposed framework interfaces the acoustic sensor directly with the SCTN-based network via Pulse Density Modulation (PDM), avoiding costly analog-to-digital conversions. This paper presents a new connectivity approach for Spiking Neuron (SN)-based neural networks: we suggest treating the SCTN as a basic building block in the design of programmable analog electronic circuits. Usually, a neuron is used as a repeated modular element in a neural network structure, and the connectivity between neurons in different layers is well defined, yielding a modular network composed of several layers with full or partial connectivity. The proposed approach instead controls the behavior of the spiking neurons and applies smart connectivity to enable the design of simple SNN-based analog circuits. Unlike existing NN-based solutions, in which the preprocessing phase is carried out using analog circuits and analog-to-digital conversion, we integrate the preprocessing phase into the network itself. This allows the basic SCTN to be treated as an analog module, enabling the design of simple analog circuits based on an SNN with unique interconnections between the neurons.
The efficiency of the proposed approach is demonstrated by implementing SCTN-based resonators for sound feature extraction and classification. The proposed SCTN-based sound classification approach achieves a classification accuracy of 98.73% on the Real-World Computing Partnership (RWCP) database.
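The STDP learning rule mentioned above can be sketched as a standard pair-based exponential window; the learning rates and time constant here are illustrative assumptions, not values from the paper:

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    # Pair-based STDP: delta_t = t_post - t_pre (in ms).
    # Pre-before-post (delta_t > 0) potentiates the synapse;
    # post-before-pre (delta_t <= 0) depresses it.
    if delta_t > 0:
        return a_plus * np.exp(-delta_t / tau)
    return -a_minus * np.exp(delta_t / tau)

# Apply the rule over a few pre/post spike pairs, clipping the
# weight to [0, 1].
w = 0.5
for t_pre, t_post in [(10.0, 15.0), (30.0, 28.0), (50.0, 52.0)]:
    w = float(np.clip(w + stdp_dw(t_post - t_pre), 0.0, 1.0))
```

Because the update magnitude decays exponentially with the spike-time difference, only near-coincident spikes change the weight appreciably — which is what lets an STDP-trained network latch onto the repeating frequency components of an acoustic signal.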


1987 ◽  
Vol 34 (7) ◽  
pp. 1553-1556 ◽  
Author(s):  
R.E. Howard ◽  
D.B. Schwartz ◽  
J.S. Denker ◽  
R.W. Epworth ◽  
H.P. Graf ◽  
...  

2012 ◽  
Vol 2012 ◽  
pp. 1-19 ◽  
Author(s):  
Xiaofang Hu ◽  
Shukai Duan ◽  
Lidan Wang

The Chaotic Neural Network, also denoted by the acronym CNN, exhibits rich dynamical behaviors that can be harnessed in promising engineering applications. However, due to its complex synaptic learning rules and network structure, it is difficult to update its synaptic weights quickly and to implement a large-scale physical circuit. This paper presents an implementation scheme for a novel CNN with memristive synapses that may provide a feasible path for the further development of CNNs. The memristor, widely known as the fourth fundamental circuit element, was theoretically predicted by Chua in 1971 and first realized in 2008 by researchers at Hewlett-Packard Laboratories. Memristor-based hybrid nanoscale CMOS technology is expected to revolutionize digital and neuromorphic computation. The proposed memristive CNN has four significant features: (1) nanoscale memristors greatly simplify the synaptic circuit and make synaptic-weight updates easy; (2) it can separate stored patterns from superimposed input; (3) it can handle one-to-many associative memory; (4) it can handle many-to-many associative memory. Simulation results illustrate the effectiveness of the proposed scheme.
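The memristive synapse can be sketched with the linear ion-drift model associated with the 2008 HP device; the device parameters below are illustrative assumptions, and the paper's actual synapse model may differ:

```python
import numpy as np

def simulate_memristor(voltage, dt, r_on=100.0, r_off=16000.0,
                       d=10e-9, mu_v=1e-14, x0=0.5):
    # Linear ion-drift model: memristance M = x*R_on + (1-x)*R_off,
    # where x in [0, 1] is the doped-region fraction, driven by the
    # current through the device (dx/dt = mu_v * R_on / D^2 * i).
    x = x0
    memristance = []
    for v in voltage:
        m = x * r_on + (1.0 - x) * r_off
        i = v / m
        x += mu_v * r_on / d**2 * i * dt
        x = min(max(x, 0.0), 1.0)  # clamp state to physical bounds
        memristance.append(m)
    return np.array(memristance)

# Drive the device with a 1 kHz sine: the resistance traces out the
# pinched hysteresis loop characteristic of a memristor.
t = np.arange(0, 1e-3, 1e-6)
m = simulate_memristor(np.sin(2 * np.pi * 1e3 * t), dt=1e-6)
```

The key property for synaptic use is that the resistance depends on the history of applied charge: a sustained positive bias lowers the memristance (strengthening the synapse), which is what allows a single nanoscale device to store an analog weight.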

