Recurrent Network Models for Perfect Temporal Integration of Fluctuating Correlated Inputs

2009 ◽  
Vol 5 (6) ◽  
pp. e1000404 ◽  
Author(s):  
Hiroshi Okamoto ◽  
Tomoki Fukai
Neuron ◽  
2016 ◽  
Vol 90 (1) ◽  
pp. 128-142 ◽  
Author(s):  
Kanaka Rajan ◽  
Christopher D. Harvey ◽  
David W. Tank

Author(s):  
Teijiro Isokawa ◽  
Nobuyuki Matsui ◽  
Haruhiko Nishimura

Quaternions are a class of hypercomplex number systems, a four-dimensional extension of imaginary numbers, which are extensively used in fields such as modern physics and computer graphics. Although quaternionic neural networks have so far seen fewer applications than complex-valued neural networks, their number has been increasing recently. In this chapter, the authors describe two types of quaternionic neural network models. One type is a multilayer perceptron based on 3D geometrical affine transformations by quaternions. The operations that this network can perform are translation, dilatation, and spatial rotation in three-dimensional space. Several examples are provided to demonstrate the utility of this network. The other type is a Hopfield-type recurrent network whose parameters are directly encoded as quaternions. The stability of this network is demonstrated by proving that its energy decreases monotonically as the neuron states change. The fundamental properties of this network are illustrated with a three-neuron network.
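As a minimal sketch of the quaternionic affine transformation the chapter describes (my own illustration, not the authors' code, with hypothetical function names), a 3D point encoded as a pure quaternion p = (0, x, y, z) can be rotated via the sandwich product q p q* with a unit quaternion q, then dilated and translated:

```python
import math

def qmul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def qconj(q):
    """Quaternion conjugate (w, -x, -y, -z)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def transform(point, axis, angle, scale, shift):
    """Rotate `point` about unit `axis` by `angle` via q p q*, then dilate and translate."""
    s, c = math.sin(angle/2), math.cos(angle/2)
    q = (c, s*axis[0], s*axis[1], s*axis[2])   # unit rotation quaternion
    p = (0.0,) + tuple(point)                  # point as a pure quaternion
    _, x, y, z = qmul(qmul(q, p), qconj(q))    # sandwich product q p q*
    return tuple(scale*v + t for v, t in zip((x, y, z), shift))

# Rotating (1, 0, 0) by 90 degrees about the z-axis yields (0, 1, 0).
print(transform((1, 0, 0), (0, 0, 1), math.pi/2, 1.0, (0, 0, 0)))
```

Composing translation, dilatation, and rotation this way is exactly the set of operations the chapter's multilayer perceptron realizes per layer.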


Author(s):  
RAYMOND S. T. LEE ◽  
JAMES N. K. LIU

Financial prediction is one of the most common applications in contemporary scientific research. In this paper, we present a fully integrated stock prediction system, NORN Predictor, a Neural Oscillatory-based Recurrent Network for financial prediction that provides both (a) long-term trend prediction and (b) short-term stock price prediction. One of the major characteristics of the proposed system is the automation of conventional financial technical analysis techniques such as market pattern analysis via the NOEGM (Neural Oscillatory-based Elastic Graph Matching) model and its integration with time-difference recurrent neural network models. This provides a fully integrated and automated tool for the analysis and investigation of stock investments. For implementation, the stock pricing information of 33 major Hong Kong stocks from 1990 to 1999 was adopted for system training and evaluation. Compared with contemporary neural prediction models, the proposed system achieves promising results in terms of efficiency and accuracy.


2007 ◽  
Vol 97 (6) ◽  
pp. 3859-3867 ◽  
Author(s):  
Hiroshi Okamoto ◽  
Yoshikazu Isomura ◽  
Masahiko Takada ◽  
Tomoki Fukai

Temporal integration of externally or internally driven information is required for a variety of cognitive processes. This computation is generally linked with graded rate changes in cortical neurons, which typically appear during the delay period of cognitive tasks in the prefrontal and other cortical areas. Here, we present a neural network model that produces graded (climbing or descending) neuronal activity. Model neurons are interconnected randomly by AMPA-receptor–mediated fast excitatory synapses and are subject to noisy background excitatory and inhibitory synaptic inputs. In each neuron, a prolonged afterdepolarizing potential follows every spike generation. Driven by an external input, individual neurons then display bimodal rate changes between a baseline state and an elevated firing state, with the latter being sustained by regenerated afterdepolarizing potentials. We show that, when the variance of the background input and the uniform weight of recurrent synapses are adequately tuned, stochastic noise and reverberating synaptic input organize these bimodal changes into a sequence that exhibits graded population activity with a nearly constant slope. To test the validity of the proposed mechanism, we analyzed the graded activity of anterior cingulate cortex neurons in monkeys performing delayed conditional Go/No-go discrimination tasks. The delay-period activities of cingulate neurons exhibited bimodal activity patterns and trial-to-trial variability similar to those predicted by the proposed model.
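The core statistical idea, that step-like bimodal transitions in single neurons average into a smooth population ramp, can be sketched in a toy simulation (my own illustration with made-up parameters, not the authors' biophysical model): each unit jumps from a baseline to an elevated rate at a noise-driven random time and stays there, as if sustained by afterdepolarization.

```python
import random
random.seed(0)

N, T = 200, 100                 # neurons, time steps (arbitrary units)
p_switch = 0.03                 # per-step probability of the noise-driven jump
r_base, r_high = 2.0, 20.0      # baseline and elevated firing rates (Hz)

state = [False] * N             # False = baseline, True = elevated (absorbing)
pop_rate = []
for t in range(T):
    for i in range(N):
        if not state[i] and random.random() < p_switch:
            state[i] = True     # elevated state persists once entered
    pop_rate.append(sum(r_high if s else r_base for s in state) / N)

# Each neuron's rate is a step function, yet the population average climbs
# smoothly and monotonically from near r_base toward r_high.
print(round(pop_rate[0], 1), round(pop_rate[-1], 1))
```

Because the switching times are exponentially distributed and independent, the expected population rate follows a saturating ramp; tuning the switch probability (noise variance and recurrent weight in the full model) sets its slope.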


2021 ◽  
Author(s):  
Roxana Zeraati ◽  
Yan-Liang Shi ◽  
Nicholas A Steinmetz ◽  
Marc A Gieselmann ◽  
Alexander Thiele ◽  
...  

Neural activity fluctuates endogenously on timescales varying across the neocortex. The variation in these intrinsic timescales relates to the functional specialization of cortical areas and their involvement in the temporal integration of information. Yet, it is unknown whether these timescales can adjust rapidly and selectively to the demands of a cognitive task. We measured intrinsic timescales of local spiking activity within columns of area V4 while monkeys performed spatial attention tasks. The ongoing spiking activity unfolded across at least two distinct timescales, fast and slow, and the slow timescale increased when monkeys attended to the receptive field's location. A recurrent network model shows that multiple timescales in local dynamics arise from spatial connectivity mimicking vertical and horizontal interactions in visual cortex, and that slow timescales increase with the efficacy of recurrent interactions. Our results reveal that targeted neural populations integrate information over variable timescales following the demands of a cognitive task, and propose an underlying network mechanism.
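A common way to read an intrinsic timescale off ongoing activity is to fit the exponential decay of its autocorrelation. As a minimal sketch (not the authors' estimation pipeline), the lag-1 autocorrelation of an AR(1) surrogate with a known timescale tau recovers that timescale via r(1) = exp(-dt/tau):

```python
import math, random
random.seed(1)

dt, tau_true = 1.0, 20.0                  # ms; tau_true is an assumed value
a = math.exp(-dt / tau_true)              # AR(1) coefficient for timescale tau
x, xs = 0.0, []
for _ in range(500000):
    x = a * x + random.gauss(0.0, 1.0)    # discretized noisy relaxation
    xs.append(x)

mean = sum(xs) / len(xs)
var = sum((v - mean)**2 for v in xs) / len(xs)
r1 = sum((xs[i] - mean) * (xs[i+1] - mean)
         for i in range(len(xs) - 1)) / len(xs) / var
tau_hat = -dt / math.log(r1)              # invert r(1) = exp(-dt/tau)
print(round(tau_hat, 1))                  # close to tau_true = 20 ms
```

Separating two superimposed timescales, as in the paper, requires fitting a mixture of exponentials to the full autocorrelation rather than a single lag, but the inversion step is the same.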


2018 ◽  
Vol 8 (11) ◽  
pp. 2018 ◽  
Author(s):  
Setu Shah ◽  
Zina Ben Miled ◽  
Rebecca Schaefer ◽  
Steve Berube

Predicting water demand is becoming increasingly critical because of the scarcity of this natural resource. Indeed, the subject has been the focus of numerous studies by researchers around the world. Several models have been proposed that can predict water demand using both statistical and machine learning techniques. These models have successfully identified features that influence water demand trends in rural and metropolitan areas. However, while such models, including recurrent network models proposed by the authors, are able to predict normal water demands, most have difficulty estimating potential deviations from the norm. Outliers in water demand can arise for various reasons, including high temperatures and voluntary or mandatory consumption restrictions imposed by water utility companies. Estimating these deviations is necessary, especially for water utility companies with a small service footprint, in order to plan water distribution efficiently. This paper proposes a differential learning model that can capture both over-consumption and under-consumption. The proposed differential model builds on a previously proposed recurrent neural network model that was successfully used to predict water demand in central Indiana.


2019 ◽  
Author(s):  
Erik Nygren ◽  
Alexandro Ramirez ◽  
Brandon McMahan ◽  
Emre Aksay ◽  
Walter Senn

There has been much focus on the mechanisms of temporal integration, but little on how circuits learn to integrate. In the adult oculomotor system, where a neural integrator maintains fixations, changes in integration dynamics can be driven by visual error signals. However, we show through dark-rearing experiments that visual inputs are not necessary for initial integrator development. We therefore propose a vision-independent learning mechanism whereby a recurrent network learns to integrate via a ‘teaching’ signal formed by low-pass filtered feedback of its population activity. The key is the segregation of local recurrent inputs onto a dendritic compartment and teaching inputs onto a somatic compartment of an integrator neuron. Model instantiation for oculomotor control shows how a self-corrective teaching signal through the cerebellum can generate an integrator with both the dynamical and tuning properties necessary to drive eye muscles and maintain gaze angle. This bootstrap learning paradigm may be relevant for development and plasticity of temporal integration more generally.

Highlights:
- A neuronal architecture that learns to integrate saccadic commands for eye position.
- Learning is based on the recurrent dendritic prediction of somatic teaching signals.
- Experiment and model show that no visual feedback is required for initial integrator learning.
- Cerebellum is an internal teacher for motor nuclei and integrator population.
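Why low-pass filtered feedback of the population activity can turn a leaky unit into an integrator is easiest to see in a two-variable toy model (my own simplification with made-up time constants, not the authors' full network): coupling a fast rate r to a slow low-pass copy s of itself gives the pair a zero eigenvalue, so activity set by a brief input pulse persists instead of decaying.

```python
tau, tau_s, dt = 10.0, 100.0, 0.1      # ms; fast unit, slow "teacher" filter

r, s = 0.0, 0.0                         # rate and its low-pass filtered feedback
r_leaky = 0.0                           # control: same unit without the feedback
trace = {}
for step in range(20000):               # simulate 2000 ms with Euler steps
    t = step * dt
    u = 1.0 if t < 50.0 else 0.0        # 50 ms input pulse (a saccadic command)
    r += dt * (-r + s + u) / tau        # somatic drive = teaching signal + input
    s += dt * (r - s) / tau_s           # slow low-pass filter of the activity
    r_leaky += dt * (-r_leaky + u) / tau
    if step in (10000, 19999):
        trace[round(t)] = (r, r_leaky)

# With feedback, r settles on the line attractor r = s at roughly the
# integrated input 50/(tau + tau_s); the control unit decays back to zero.
print(trace)
```

In the paper the feedback weight onto the dendrite is learned rather than fixed, but this fixed-weight limit shows what the self-corrective teaching signal buys: the quantity tau·r + tau_s·s integrates the input exactly, independently of the filter time constant.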

