Time optimal information transfer in spintronics networks

Author(s):  
Frank C Langbein ◽  
Sophie Schirmer ◽  
Edmond Jonckheere


2017 ◽
Author(s):  
Himadri S. Samanta ◽  
Michael Hinczewski ◽  
D. Thirumalai

Abstract
Signaling in enzymatic networks is typically triggered by environmental fluctuations, resulting in a series of stochastic chemical reactions, leading to corruption of the signal by noise. For example, information flow is initiated by binding of extracellular ligands to receptors, which is transmitted through a cascade involving kinase-phosphatase stochastic chemical reactions. For a class of such networks, we develop a general field-theoretic approach in order to calculate the error in signal transmission as a function of an appropriate control variable. Application of the theory to a simple push-pull network, a module in the kinase-phosphatase cascade, recovers the exact results for the error in signal transmission previously obtained using umbral calculus (Phys. Rev. X 4, 041017 (2014)). We illustrate the generality of the theory by studying the minimal errors in noise reduction in a reaction cascade with two connected push-pull modules. Such a cascade behaves as an effective three-species network with a pseudo intermediate. In this case, optimal information transfer, resulting in the smallest square of the error between the input and output, occurs with a time delay, which is given by the inverse of the decay rate of the pseudo intermediate. Surprisingly, in these examples the minimum error computed using simulations that take non-linearities and the discrete nature of molecules into account coincides with the predictions of a linear theory. In contrast, there are substantial deviations between simulations and predictions of the linear theory for the error in signal propagation in an enzymatic push-pull network for a certain range of parameters. Inclusion of second-order perturbative corrections minimizes the differences between simulations and theoretical predictions. Our study establishes that a field-theoretic formulation of stochastic biological signaling offers a systematic way to understand error propagation in networks of arbitrary complexity.
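The push-pull module at the heart of this abstract can be explored numerically with a standard Gillespie (stochastic simulation) algorithm. The sketch below simulates a single activation/deactivation cycle at fixed total copy number; the function name, rate constants, and copy number are illustrative assumptions, not the authors' model, and the full field-theoretic treatment of course goes far beyond this.

```python
import random

def gillespie_push_pull(k_act, k_deact, n_total, t_max, seed=0):
    """Minimal Gillespie simulation of a push-pull module: each inactive
    molecule is activated (e.g. phosphorylated by a kinase) at rate k_act,
    and each active molecule is deactivated (e.g. by a phosphatase) at
    rate k_deact. Returns the sampled (time, active-count) trajectory."""
    rng = random.Random(seed)
    x_active, t = 0, 0.0
    trajectory = [(t, x_active)]
    while t < t_max:
        a1 = k_act * (n_total - x_active)   # activation propensity
        a2 = k_deact * x_active             # deactivation propensity
        a0 = a1 + a2
        if a0 == 0.0:
            break
        t += rng.expovariate(a0)            # exponential waiting time
        if rng.random() < a1 / a0:          # choose which reaction fires
            x_active += 1
        else:
            x_active -= 1
        trajectory.append((t, x_active))
    return trajectory
```

With symmetric rates the active fraction fluctuates around n_total * k_act / (k_act + k_deact); the size of those fluctuations is exactly the kind of intrinsic noise whose propagation through a cascade the field-theoretic approach quantifies.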


1996 ◽  
Vol 8 (4) ◽  
pp. 757-771 ◽  
Author(s):  
H.-U. Bauer ◽  
R. Der

The magnification exponents μ occurring in adaptive map formation algorithms like Kohonen's self-organizing feature map deviate from the information-theoretically optimal value μ = 1 as well as from the values that optimize, e.g., the mean square distortion error (μ = 1/3 for one-dimensional maps). At the same time, models for categorical perception such as the "perceptual magnet" effect, which are based on topographic maps, require negative magnification exponents μ < 0. We present an extension of the self-organizing feature map algorithm, which utilizes adaptive local learning step sizes to actually control the magnification properties of the map. By changing a single parameter, maps with optimal information transfer, with various minimal reconstruction errors, or with an inverted magnification can be generated. Analytic results on this new algorithm are complemented by numerical simulations.
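The idea of adaptive local learning step sizes can be sketched as a small modification of the standard 1-D Kohonen update: scale the winner's step by a running estimate of the local input density raised to a tunable power. This is a minimal illustrative sketch, not the authors' exact algorithm; the function name, the exponent parameter `m`, and the density-estimation rule are assumptions (with m = 0 the standard self-organizing feature map is recovered).

```python
import random, math

def adaptive_sofm_1d(n_units, n_steps, m=0.0, sigma=1.0, eps0=0.1, seed=0):
    """One-dimensional Kohonen map with adaptive local learning steps.
    The learning rate of the winning unit is scaled by its estimated
    local win frequency raised to the power m (hypothetical parameter),
    which shifts the map's magnification behaviour."""
    rng = random.Random(seed)
    w = sorted(rng.random() for _ in range(n_units))  # codebook in [0, 1]
    density = [1.0 / n_units] * n_units               # running win-frequency estimate
    for _ in range(n_steps):
        x = rng.betavariate(2, 5)                     # non-uniform stimulus density
        s = min(range(n_units), key=lambda i: abs(x - w[i]))  # winner
        for i in range(n_units):                      # update density estimates
            density[i] += 0.01 * ((1.0 if i == s else 0.0) - density[i])
        eps_local = eps0 * density[s] ** m            # adaptive local step size
        for i in range(n_units):                      # neighbourhood-weighted update
            h = math.exp(-((i - s) ** 2) / (2.0 * sigma ** 2))
            w[i] += eps_local * h * (x - w[i])
    return w
```

Varying m changes how densely codebook vectors crowd into high-density stimulus regions, which is the magnification property the single control parameter in the abstract governs.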


2021 ◽  
Vol 17 (4) ◽  
pp. e1008897
Author(s):  
Kai Röth ◽  
Shuai Shao ◽  
Julijana Gjorgjieva

Sensory organs transmit information to downstream brain circuits using a neural code composed of spikes from multiple neurons. According to the prominent efficient coding framework, the properties of sensory populations have evolved to encode maximum information about stimuli given biophysical constraints. How information coding depends on the way sensory signals from multiple channels converge downstream is still unknown, especially in the presence of noise, which corrupts the signal at different points along the pathway. Here, we calculated the optimal information transfer of a population of nonlinear neurons under two scenarios. First, a lumped-coding channel, where the information from different inputs converges to a single channel, thus reducing the number of neurons. Second, an independent-coding channel, where different inputs contribute independent information without convergence. In each case, we investigated information loss when the sensory signal was corrupted by two sources of noise. We determined critical noise levels at which the optimal number of distinct thresholds of individual neurons in the population changes. Comparing our system to classical physical systems, these changes correspond to first- or second-order phase transitions for the lumped- or the independent-coding channel, respectively. We relate our theoretical predictions to coding in a population of auditory nerve fibers recorded experimentally, and find signatures of efficient coding. Our results yield important insights into the diverse coding strategies used by neural populations to optimally integrate sensory stimuli in the presence of distinct sources of noise.
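The threshold-placement question in this abstract can be made concrete with a small mutual-information calculation: binary threshold neurons encode a uniform stimulus, their outputs are flipped with some probability as a simple stand-in for downstream noise, and one compares distinct versus identical thresholds. This is an illustrative sketch under those assumptions, not the authors' model (which treats richer noise sources and population nonlinearities).

```python
from math import log2

def mutual_info(thresholds, flip_p, k=200):
    """Mutual information (bits) between a uniform stimulus on [0, 1],
    discretised into k bins, and the joint response of binary threshold
    neurons whose outputs are independently flipped with probability
    flip_p. Neurons are conditionally independent given the stimulus."""
    n = len(thresholds)
    p_r = [0.0] * (2 ** n)          # marginal response distribution
    h_r_given_x = 0.0               # conditional entropy H(R | X)
    for j in range(k):
        x = (j + 0.5) / k           # bin centre
        fire = [(1.0 - flip_p) if x > th else flip_p for th in thresholds]
        for r in range(2 ** n):     # enumerate joint binary responses
            p_joint = 1.0
            for i in range(n):
                bit = (r >> i) & 1
                p_joint *= fire[i] if bit else (1.0 - fire[i])
            p_r[r] += p_joint / k
            if p_joint > 0.0:
                h_r_given_x -= (p_joint / k) * log2(p_joint)
    h_r = -sum(q * log2(q) for q in p_r if q > 0.0)  # response entropy H(R)
    return h_r - h_r_given_x
```

Without noise, two distinct thresholds at 1/3 and 2/3 transmit about log2(3) ≈ 1.58 bits, whereas two identical thresholds transmit 1 bit; sweeping flip_p erodes that advantage, which is the kind of noise-dependent trade-off behind the threshold-splitting transitions described above.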


2020 ◽  
Author(s):  
Kai Röth ◽  
Shuai Shao ◽  
Julijana Gjorgjieva



2001 ◽  
Vol 38-40 ◽  
pp. 397-402 ◽  
Author(s):  
P.H.E. Tiesinga ◽  
J.-M. Fellous ◽  
J.V. José ◽  
T.J. Sejnowski

2017 ◽  
Vol 96 (1) ◽  
Author(s):  
Himadri S. Samanta ◽  
Michael Hinczewski ◽  
D. Thirumalai
