On the applicability of STDP-based learning mechanisms to spiking neuron network models

AIP Advances ◽  
2016 ◽  
Vol 6 (11) ◽  
pp. 111305 ◽  
Author(s):  
A. Sboev ◽  
D. Vlasov ◽  
A. Serenko ◽  
R. Rybka ◽  
I. Moloshnikov

2017 ◽  
Vol 12 (4) ◽  
pp. 109-124 ◽  
Author(s):  
S.A. Lobov ◽  
M.O. Zhuravlev ◽  
V.A. Makarov ◽  
V.B. Kazantsev

2014 ◽  
Vol 51 (3) ◽  
pp. 837-857
Author(s):  
K. Borovkov ◽  
G. Decrouez ◽  
M. Gilson

The paper deals with nonlinear Poisson neuron network models with bounded memory dynamics, which can include both Hebbian learning mechanisms and refractory periods. The state of the network is described by the times elapsed since its neurons fired within the post-synaptic transfer kernel memory span, and the current strengths of synaptic connections, the state spaces of our models being hierarchies of finite-dimensional components. We prove the ergodicity of the stochastic processes describing the behaviour of the networks, establish the existence of continuously differentiable stationary distribution densities (with respect to the Lebesgue measures of corresponding dimensionality) on the components of the state space, and find upper bounds for them. For the density components, we derive a system of differential equations that can be solved in a few simplest cases only. Approaches to approximate computation of the stationary density are discussed. One approach is to reduce the dimensionality of the problem by modifying the network so that each neuron cannot fire if the number of spikes it emitted within the post-synaptic transfer kernel memory span reaches a given threshold. We show that the stationary distribution of this ‘truncated’ network converges to that of the unrestricted network as the threshold increases, and that the convergence is at a superexponential rate. A complementary approach uses discrete Markov chain approximations to the network process.
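The model class described above can be illustrated with a toy simulation. The sketch below (an assumption for illustration, not the paper's construction) simulates a small network in which each neuron fires as a nonlinear Poisson process whose rate depends on presynaptic spikes within a bounded memory span, with a simple Hebbian weight update on firing; all parameter values and the sigmoidal rate function are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance of the model class: nonlinear Poisson neurons with bounded
# memory (spikes are forgotten beyond the post-synaptic transfer kernel
# span) and a simple Hebbian weight update. All constants are illustrative.

N = 3            # neurons
DT = 1e-3        # time step (s)
MEMORY = 0.05    # post-synaptic transfer kernel memory span (s)
TAU = 0.02       # kernel decay constant (s)

W = rng.uniform(0.1, 0.5, size=(N, N))   # synaptic strengths
np.fill_diagonal(W, 0.0)
last_spikes = [[] for _ in range(N)]     # spike times within the memory span

def kernel(age):
    """Exponential post-synaptic transfer kernel, zero beyond the memory span."""
    return np.exp(-age / TAU) if age < MEMORY else 0.0

def rate(i, t):
    """Nonlinear (here: sigmoidal) firing rate driven by recent presynaptic spikes."""
    drive = sum(W[j, i] * kernel(t - s)
                for j in range(N) for s in last_spikes[j])
    return 20.0 / (1.0 + np.exp(-drive))  # Hz, bounded nonlinearity

t = 0.0
spike_count = 0
while t < 1.0:
    for i in range(N):
        if rng.random() < rate(i, t) * DT:   # Poisson firing in this time step
            # Hebbian update: strengthen synapses from recently active neurons
            for j in range(N):
                if j != i and any(t - s < MEMORY for s in last_spikes[j]):
                    W[j, i] = min(W[j, i] + 0.01, 1.0)
            last_spikes[i].append(t)
            spike_count += 1
    # bounded memory dynamics: forget spikes outside the memory span
    for i in range(N):
        last_spikes[i] = [s for s in last_spikes[i] if t - s < MEMORY]
    t += DT
```

The network state at any time is exactly what the abstract describes: the recent spike times per neuron plus the current weight matrix `W`, so the state space dimension changes with the number of remembered spikes.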


2021 ◽  
Author(s):  
Yuntao Han ◽  
Tao Yu ◽  
Silu Cheng ◽  
Jiangtao Xu

Spiking neuron networks (SNNs) have shown advantages in processing event-based data for image classification. However, the classification accuracy of SNNs decreases in noisy environments. In this letter, we propose the cascade spiking neuron network (cascade-SNN) to address this problem. We use a spiking convolutional neural network (SCNN) for feature extraction and a liquid state machine (LSM) for readout. Compared with earlier work on ANNs, this network achieves state-of-the-art classification accuracy on the DVS-CIFAR10 and DVS-Gesture datasets, both of which are challenging because of their noisy environments. We conducted ablation experiments to verify that the proposed structure is effective and analyzed the influence of different hyper-parameters.
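The cascade idea of a spiking convolutional stage feeding a liquid state machine can be sketched as follows. This is an illustrative toy (not the authors' implementation): a single spiking convolution extracts features from a synthetic event frame, and a small fixed random reservoir of leaky integrate-and-fire (LIF) neurons produces a state vector that would feed a trained linear readout; the layer sizes, weights, and LIF constants are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def lif_step(v, inp, tau=0.9, thresh=1.0):
    """One leaky integrate-and-fire update; returns new potentials and spikes."""
    v = tau * v + inp
    spikes = (v >= thresh).astype(float)
    return v * (1.0 - spikes), spikes     # reset potentials where a spike occurred

# --- spiking convolutional stage (single 3x3 kernel, stride 1) ---
frame = (rng.random((8, 8)) < 0.2).astype(float)   # toy binary event frame
kernel = rng.normal(0, 0.5, (3, 3))
conv_in = np.array([[np.sum(frame[r:r + 3, c:c + 3] * kernel)
                     for c in range(6)] for r in range(6)])
conv_v, conv_spikes = lif_step(np.zeros((6, 6)), conv_in)

# --- liquid state machine stage: fixed random recurrent LIF reservoir ---
R = 32                                    # reservoir size (illustrative)
W_in = rng.normal(0, 0.3, (R, 36))        # input weights from conv spikes
W_rec = rng.normal(0, 0.1, (R, R))        # fixed random recurrent weights
res_v = np.zeros(R)
state = np.zeros(R)
for _ in range(10):                       # let the liquid evolve briefly
    drive = W_in @ conv_spikes.ravel() + W_rec @ state
    res_v, state = lif_step(res_v, drive)

# `state` would then feed a trained linear classifier (the readout)
print(state.shape)                        # prints (32,)
```

Only the readout is trained in an LSM, which is one reason this cascade can be robust: the fixed reservoir integrates noisy spike features over time before classification.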

