arithmetic networks
Recently Published Documents


TOTAL DOCUMENTS: 11 (last five years: 1)

H-INDEX: 4 (last five years: 0)

2021 ◽  
pp. 1-17
Author(s):  
Anna A. Matejko ◽  
Daniel Ansari

Abstract Visuospatial working memory (VSWM) plays an important role in arithmetic problem solving, and the relationship between these two skills is thought to change over development. Even though neuroimaging studies have demonstrated that VSWM and arithmetic both recruit frontoparietal networks, inferences about common neural substrates have largely been made by comparisons across studies. Little work has examined how brain activation for VSWM and arithmetic converge within the same participants and whether there are age-related changes in the overlap of these neural networks. In this study, we examined how brain activity for VSWM and arithmetic overlap in 38 children and 26 adults. Although both children and adults recruited the intraparietal sulcus (IPS) for VSWM and arithmetic, children showed more focal activation within the right IPS, whereas adults recruited the bilateral IPS, superior frontal sulcus/middle frontal gyrus, and right insula. A comparison of the two groups revealed that adults recruited a more left-lateralized network of frontoparietal regions for VSWM and arithmetic compared with children. Together, these findings suggest possible neurocognitive mechanisms underlying the strong relationship between VSWM and arithmetic and provide evidence that the association between VSWM and arithmetic networks changes with age.


2016 ◽  
Vol 26 (3) ◽  
pp. 687-715
Author(s):  
Andrei Gabrielov ◽  
Nicolai Vorobjov

2013 ◽  
Vol 21 (3) ◽  
pp. 361-387 ◽  
Author(s):  
Richard J. Preen ◽  
Larry Bull

A number of representation schemes have been presented for use within learning classifier systems, ranging from binary encodings to artificial neural networks. This paper presents results from an investigation into using a temporally dynamic symbolic representation within the XCSF learning classifier system. In particular, dynamical arithmetic networks are used to represent the traditional condition-action production system rules to solve continuous-valued reinforcement learning problems and to perform symbolic regression, achieving performance competitive with traditional genetic programming on a number of composite polynomial tasks. In addition, the network outputs are repeatedly sampled at varying temporal intervals to perform multistep-ahead predictions of a financial time series.
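The idea of a temporally dynamic arithmetic representation can be illustrated with a toy sketch. This is an assumed form, not the paper's actual XCSF rule encoding: each node combines its inputs with simple arithmetic operations, the state is iterated over discrete time steps, and sampling the output node at different intervals yields different behaviour from the same network.

```python
# Illustrative sketch (assumed form, not the paper's exact system) of a
# small dynamical arithmetic network: nodes apply arithmetic operations
# and the state evolves over discrete time steps.

def step(state, x):
    """One synchronous update: each node combines inputs arithmetically."""
    a, b, out = state
    a_new = a + x            # accumulator node (addition)
    b_new = b * 0.5 + x      # leaky node (multiplication + addition)
    out_new = a_new - b_new  # output node (subtraction)
    return (a_new, b_new, out_new)

def sample(x, steps):
    """Iterate the network on a constant input; read the output node."""
    state = (0.0, 0.0, 0.0)
    for _ in range(steps):
        state = step(state, x)
    return state[2]
```

Sampling `sample(1.0, t)` at increasing `t` traces out a sequence of values from one fixed network, which is the sense in which varying the sampling interval changes the symbolic behaviour obtained.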


1999 ◽  
Vol 11 (3) ◽  
pp. 715-745 ◽  
Author(s):  
Ricard Gavaldà ◽  
Hava T. Siegelmann

This article studies the computational power of various discontinuous real computational models based on the classical analog recurrent neural network (ARNN). An ARNN consists of a finite number of neurons, each computing a polynomial net function followed by a sigmoid-like continuous activation function. We introduce arithmetic networks as ARNNs augmented with a few simple discontinuous (e.g., threshold or zero-test) neurons. We argue that even with weights restricted to polynomial-time computable reals, arithmetic networks are able to compute arbitrarily complex recursive functions. We identify many types of neural networks that are at least as powerful as arithmetic nets, some of which are not in fact discontinuous but instead allow additional arithmetic operations in the net function (e.g., neurons that can use divisions and polynomial net functions inside sigmoid-like continuous activation functions). These arithmetic networks are equivalent to the Blum-Shub-Smale model when the latter is restricted to a bounded number of registers. With respect to implementation on digital computers, we show that arithmetic networks with rational weights can be simulated with exponential precision, but even with polynomial-time computable real weights, arithmetic networks are not subject to any fixed precision bounds. This contrasts with the ARNN, which is known to demand precision linear in the computation time. When nontrivial periodic functions (e.g., fractional part, sine, tangent) are added to arithmetic networks, the resulting networks are computationally equivalent to a massively parallel machine. Thus, these highly discontinuous networks can solve the presumably intractable class of PSPACE-complete problems in polynomial time.
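The augmentation the abstract describes can be sketched minimally: an otherwise continuous recurrent update in which a single neuron applies a hard threshold. The saturated-linear activation, the network size, and the toy weights below are assumptions for illustration, not the paper's construction.

```python
import numpy as np

# Minimal illustrative sketch of an analog recurrent neural network (ARNN)
# augmented with one discontinuous threshold neuron, i.e. an "arithmetic
# network" in the abstract's sense. Activation choice and weights are
# arbitrary toy assumptions.

def sat(x):
    """Continuous, sigmoid-like (saturated-linear) activation."""
    return np.clip(x, 0.0, 1.0)

def heaviside(x):
    """Discontinuous zero-test/threshold neuron: 1 if x > 0, else 0."""
    return float(x > 0.0)

def arithmetic_net_step(state, W, w_thr):
    """One synchronous update: analog neurons plus one threshold neuron."""
    nxt = sat(W @ state)                # continuous part of the network
    nxt[-1] = heaviside(w_thr @ state)  # the single discontinuous neuron
    return nxt
```

The point of the construction is that this one discontinuous element, combined with real weights, is what lifts the model's power beyond the purely analog ARNN.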


1996 ◽  
Vol 7 (1) ◽  
pp. 41-51 ◽  
Author(s):  
J. L. Montaña ◽  
J. E. Morais ◽  
Luis M. Pardo

1993 ◽  
Vol 4 (1) ◽  
pp. 1-24 ◽  
Author(s):  
J. L. Montaña ◽  
L. M. Pardo

1975 ◽  
Author(s):  
Renato De Mori ◽  
Michele Elia ◽  
Angelo Serra
