Recurrent Network Models of Sequence Generation and Memory

Neuron ◽  
2016 ◽  
Vol 90 (1) ◽  
pp. 128-142 ◽  
Author(s):  
Kanaka Rajan ◽  
Christopher D. Harvey ◽  
David W. Tank
Author(s):  
Teijiro Isokawa ◽  
Nobuyuki Matsui ◽  
Haruhiko Nishimura

Quaternions are a class of hypercomplex number systems: a four-dimensional extension of the complex numbers that is used extensively in fields such as modern physics and computer graphics. Although neural networks employing quaternions have fewer applications than complex-valued neural networks, their number has been increasing recently. In this chapter, the authors describe two types of quaternionic neural network models. The first is a multilayer perceptron based on 3D geometrical affine transformations by quaternions; the operations this network can perform are translation, dilatation, and spatial rotation in three-dimensional space. Several examples are provided to demonstrate its utility. The second is a Hopfield-type recurrent network whose parameters are directly encoded as quaternions. The stability of this network is demonstrated by proving that its energy decreases monotonically as the neuron states change, and its fundamental properties are illustrated using a network of three neurons.
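As a minimal illustration of the affine operations the first network builds on (a sketch, not the authors' implementation), the following numpy snippet applies spatial rotation, dilatation, and translation to a 3D point via Hamilton products; all function names are illustrative:

```python
import numpy as np

def quat_mul(p, q):
    # Hamilton product of two quaternions stored as (w, x, y, z)
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def affine_transform(point, axis, angle, scale, translation):
    # Rotate `point` about `axis` by `angle`, then dilate and translate:
    # p' = s * (q p q*) + t, where q is a unit rotation quaternion.
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    q = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    p = np.concatenate([[0.0], point])  # embed the point as a pure quaternion
    rotated = quat_mul(quat_mul(q, p), quat_conj(q))[1:]
    return scale * rotated + np.asarray(translation, dtype=float)
```

For example, rotating (1, 0, 0) by 90 degrees about the z-axis, doubling it, and shifting by (1, 0, 0) yields (1, 2, 0).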


Author(s):  
RAYMOND S. T. LEE ◽  
JAMES N. K. LIU

Financial prediction is one of the most common applications in contemporary scientific research. In this paper, we present a fully integrated stock prediction system, the NORN Predictor, a Neural Oscillatory-based Recurrent Network for financial prediction that provides both a) long-term trend prediction and b) short-term stock price prediction. A major characteristic of the proposed system is its automation of conventional financial technical analysis techniques, such as market pattern analysis, via the NOEGM (Neural Oscillatory-based Elastic Graph Matching) model, and the integration of that model with time-difference recurrent neural network models. This provides a fully integrated and automated tool for the analysis and investigation of stock investments. For implementation, the pricing information of 33 major Hong Kong stocks over the period from 1990 to 1999 was used for system training and evaluation. Compared with contemporary neural prediction models, the proposed system achieves competitive results in terms of efficiency and accuracy.


2020 ◽  
Vol 34 (07) ◽  
pp. 11117-11124
Author(s):  
Wenhao Jiang ◽  
Lin Ma ◽  
Wei Lu

Depth has been shown to benefit neural network models. In this paper, we attempt to make the encoder-decoder model deeper for sequence generation. We propose a module that can be plugged in between the encoder and decoder to increase the depth of the whole model. The module follows a nested structure: it is divided into blocks, with each block containing several recurrent transition steps. To reduce training difficulty and preserve the information the decoder needs during transitions, our model constructs both inter-block and intra-block connections. The inter-block connections provide the thought vectors from the current block to all subsequent blocks; the intra-block connections connect all hidden states entering the current block to the current transition step. The advantages of our model are illustrated on image captioning and code captioning tasks.
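The inter-/intra-block wiring can be sketched in a few lines of numpy. This is only an illustration of the connectivity pattern, not the paper's architecture: the hidden size, step counts, and the choice to combine incoming states by summation are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8          # hidden size (illustrative)
n_blocks = 3   # blocks in the nested module
n_steps = 2    # recurrent transition steps per block

# One transition weight per (block, step), small random init for the sketch.
W = rng.standard_normal((n_blocks, n_steps, d, d)) * 0.1

def nested_module(thought):
    """Deepen an encoder 'thought vector' with nested recurrent transitions.

    Every block receives the thought vectors of all previous blocks
    (inter-block connections), and every transition step inside a block
    also sees the states that entered the block (intra-block connections);
    here incoming states are combined by simple summation.
    """
    inter = [np.asarray(thought, dtype=float)]  # vectors passed to later blocks
    for b in range(n_blocks):
        entering = sum(inter)                   # inter-block connections
        h = entering
        for s in range(n_steps):
            # intra-block connection: each step also sees the entering state
            h = np.tanh(W[b, s] @ (h + entering))
        inter.append(h)
    return inter[-1]                            # state handed to the decoder

out = nested_module(rng.standard_normal(d))
```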


2018 ◽  
Vol 8 (11) ◽  
pp. 2018 ◽  
Author(s):  
Setu Shah ◽  
Zina Ben Miled ◽  
Rebecca Schaefer ◽  
Steve Berube

Predicting water demand is becoming increasingly critical because of the scarcity of this natural resource. Indeed, the subject has been the focus of numerous studies by researchers around the world. Several models using both statistical and machine learning techniques have been proposed that can predict water demand, and they have successfully identified features that affect water demand trends in rural and metropolitan areas. However, while these models, including recurrent network models proposed by the authors, can predict normal water demand, most have difficulty estimating potential deviations from the norm. Outliers in water demand can arise for various reasons, including high temperatures and voluntary or mandatory consumption restrictions imposed by water utility companies. Estimating these deviations is necessary, especially for water utility companies with a small service footprint, in order to plan water distribution efficiently. This paper proposes a differential learning model that can capture both over-consumption and under-consumption. The proposed differential model builds on a previously proposed recurrent neural network model that was successfully used to predict water demand in central Indiana.


1998 ◽  
Vol 21 (5) ◽  
pp. 640-641 ◽  
Author(s):  
Robert M. French ◽  
Elizabeth Thomas

What new implications does the dynamical hypothesis have for cognitive science? The short answer is: none. The target article is basically an attack on traditional symbolic artificial intelligence (AI) and differs very little from prior connectionist criticisms of it. For the past 10 years, the connectionist community has been well aware of the necessity of using (and understanding) dynamically evolving, recurrent network models of cognition.


Author(s):  
Hengguan Huang ◽  
Hao Wang ◽  
Brian Mak

Over the past few years, there has been a resurgence of interest in using recurrent neural network-hidden Markov models (RNN-HMM) for automatic speech recognition (ASR). Some modern recurrent network models, such as the long short-term memory (LSTM) and the simple recurrent unit (SRU), have demonstrated promising results on this task. Recently, several scientific perspectives in neuroethology and speech production have suggested that human speech signals may be represented as discrete point patterns of acoustic events in the speech signal. This hypothesis poses challenges for RNN-HMM acoustic modeling: first, the model arbitrarily discretizes the continuous input into interval features at a fixed frame rate, which may introduce discretization errors; second, the occurrence times of such acoustic events are unknown. Furthermore, the training targets of the RNN-HMM are obtained from other (inferior) models, giving rise to misalignments. In this paper, we propose a recurrent Poisson process (RPP), which can be seen as a collection of Poisson processes over a series of time intervals in which the intensity evolves according to RNN hidden states that encode the history of the acoustic signal. It aims to allocate the latent acoustic events in continuous time. Such events are drawn efficiently from the RPP using a sampling-free solution in analytic form. The speech signal containing latent acoustic events is then reconstructed/sampled dynamically from the discretized acoustic features using linear interpolation, with the weight parameters estimated from the onsets of these events. These processes are further integrated into an SRU, forming our final model, the recurrent Poisson process unit (RPPU). Experimental evaluations on ASR tasks including CHiME-2, WSJ0, and WSJ0&1 demonstrate the effectiveness and benefits of the RPPU; for example, it achieves a relative WER reduction of 10.7% over state-of-the-art models on WSJ0.
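The overall pipeline (intensity from RNN states, analytic event placement, interpolation at onsets) can be sketched as follows. This is not the paper's RPPU: the closed-form onset used below, the expected first-event time of a unit-truncated exponential, is a hypothetical stand-in for the paper's sampling-free analytic draw, and all weights and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 4, 6                                # hidden size, frames (illustrative)
Wh = rng.standard_normal((d, d)) * 0.1     # recurrent weights (assumed)
Wx = rng.standard_normal((d, 1)) * 0.1     # input weights (assumed)
w = rng.standard_normal(d) * 0.1           # intensity projection (assumed)
frames = rng.standard_normal((T, 1))       # discretized acoustic features

h = np.zeros(d)
events, samples = [], []
for t in range(T - 1):
    h = np.tanh(Wh @ h + Wx @ frames[t])   # RNN state encodes signal history
    lam = float(np.exp(w @ h))             # per-interval Poisson intensity
    # Expected onset of the first event, conditioned on an event occurring
    # in [t, t+1): mean of an Exp(lam) variable truncated to unit length.
    # A closed-form stand-in for the paper's sampling-free analytic draw.
    tau = t + 1.0 / lam - 1.0 / np.expm1(lam)
    events.append(tau)
    # Reconstruct the signal at the event onset by linear interpolation
    # between the two neighbouring discretized frames.
    a = tau - t
    samples.append((1 - a) * frames[t] + a * frames[t + 1])
```

Each onset lands strictly inside its frame interval, so the interpolation weight `a` stays in (0, 1).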

