Relating causal and probabilistic approaches to contextuality

Author(s):  
Matt Jones

A primary goal in recent research on contextuality has been to extend this concept to cases of inconsistent connectedness, where observables have different distributions in different contexts. This article proposes a solution within the framework of probabilistic causal models, which extend hidden-variables theories, and then demonstrates an equivalence to the contextuality-by-default (CbD) framework. CbD distinguishes contextuality from direct influences of context on observables, defining the latter purely in terms of probability distributions. Here, we take a causal view of direct influences, defining the direct influence within any causal model as the total probability of latent states of the system in which a change of context changes the outcome of a measurement. Model-based contextuality (M-contextuality) is then defined as the necessity of stronger direct influences to model a full system than when its contexts are considered individually. For consistently connected systems, M-contextuality agrees with standard contextuality. For general systems, it is proved that M-contextuality is equivalent to the property that any model of a system must contain 'hidden influences', meaning direct influences that go in opposite directions for different latent states, or equivalently signalling between observers that carries no information. This criterion can be taken as formalizing the 'no-conspiracy' principle that has been proposed in connection with CbD. M-contextuality is then proved to be equivalent to CbD-contextuality, thus providing a new interpretation of CbD-contextuality as the non-existence of a model for a system without hidden direct influences. This article is part of the theme issue 'Contextuality and probability in quantum mechanics and beyond'.
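The abstract's definition of direct influence, the total probability of latent states in which switching context changes a measurement's outcome, can be sketched numerically. The toy model below is a hypothetical illustration, not the paper's formalism; the state names, probabilities, and outcomes are all invented for the example.

```python
# Illustrative sketch only: a finite causal model in which each latent state
# fixes one measurement's outcome in each of two contexts. The "direct
# influence" on that measurement is the probability mass of latent states
# whose outcome depends on the context. All values here are hypothetical.

# latent state -> (probability, outcome in context 1, outcome in context 2)
model = {
    "s1": (0.4, +1, +1),   # context change leaves the outcome fixed
    "s2": (0.3, +1, -1),   # context change flips the outcome
    "s3": (0.3, -1, -1),
}

def direct_influence(model):
    """Total probability of latent states whose outcome changes with context."""
    return sum(p for (p, o1, o2) in model.values() if o1 != o2)

print(direct_influence(model))  # 0.3 (only s2 is context-sensitive)
```

In this toy model, a 'hidden influence' in the abstract's sense would require two latent states whose outcomes flip in opposite directions when the context changes.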

Author(s):  
Ehtibar N. Dzhafarov

This paper deals with three traditional ways of defining contextuality: (C1) in terms of the (non)existence of certain joint distributions involving measurements made in several mutually exclusive contexts; (C2) in terms of the relationship between factual measurements in a given context and counterfactual measurements that could be made if one used other contexts; and (C3) in terms of the (non)existence of 'hidden variables' that determine the outcomes of all factually performed measurements. It is generally believed that the three meanings are equivalent, but the issues involved are not entirely transparent. Thus, arguments have been offered that C2 may have nothing to do with C1, and the traditional formulation of C1 itself encounters difficulties when measurement outcomes in a contextual system are treated as random variables. I show that if C1 is formulated within the framework of the Contextuality-by-Default (CbD) theory, the notion of a probabilistic coupling, the core mathematical tool of CbD, subsumes both counterfactual values and 'hidden variables'. In the latter case, a coupling itself can be viewed as a maximally parsimonious choice of a hidden variable. This article is part of the theme issue 'Contextuality and probability in quantum mechanics and beyond'.
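The core CbD tool the abstract names, a probabilistic coupling, is a joint distribution whose marginals reproduce the separately observed distributions of the coupled variables. A minimal sketch of that check, with hypothetical numbers chosen for the example:

```python
# Hedged sketch of the coupling idea: a coupling of two binary random
# variables X and Y is any joint distribution over (x, y) whose marginals
# match the given distributions of X and Y. The distributions below are
# hypothetical and only serve to demonstrate the marginal check.

p_x = {+1: 0.6, -1: 0.4}   # observed distribution of X (in its context)
p_y = {+1: 0.5, -1: 0.5}   # observed distribution of Y (in its context)

# candidate joint distribution over (x, y)
joint = {(+1, +1): 0.5, (+1, -1): 0.1, (-1, +1): 0.0, (-1, -1): 0.4}

def is_coupling(joint, p_x, p_y, tol=1e-9):
    """True iff the joint's marginals reproduce p_x and p_y."""
    marg_x = {x: sum(p for (a, _), p in joint.items() if a == x) for x in p_x}
    marg_y = {y: sum(p for (_, b), p in joint.items() if b == y) for y in p_y}
    return (all(abs(marg_x[x] - p_x[x]) < tol for x in p_x)
            and all(abs(marg_y[y] - p_y[y]) < tol for y in p_y))

print(is_coupling(joint, p_x, p_y))  # True
```

Many distinct joints satisfy the same marginal constraints; in CbD the (non)existence of a coupling with additional properties (such as maximal agreement between variables measuring the same quantity) is what separates contextual from non-contextual systems.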


Author(s):  
Gregg Jaeger

The origin and basis of the notion of quantum contextuality is identified in the Copenhagen approach to quantum mechanics, where context is automatically invoked by its requirement that the experimental arrangement involved in any measurement or set of measurements be taken into account, given that, in general, the outcome of a measurement may depend on other measurements immediately preceding or jointly performed on the same system. For Bohr, the specification of the experimental situation of any measurement is essential to its significance in light of complementarity and the omnipresence of the quantum of action in physics; for Heisenberg, the incompatibility of pairs of sharp measurements belonging to different situations coheres with both the completeness of the quantum state as an objective physical description and the principle of indeterminacy. Here, context in the Copenhagen approach is taken to be the equivalence class of experimental arrangements corresponding to a set of compatible measurements of quantum observables in standard quantum mechanics; the associated form of contextuality in quantum mechanics arises via the non-commutativity, in general, of sharp observables, proven by von Neumann, which gives rise to different contexts. This notion is related to theoretical situations explored later by Bell, by Kochen and Specker, and by others in relation to the classification of hidden-variables theories and elsewhere in physics. This article is part of the theme issue 'Contextuality and probability in quantum mechanics and beyond'.


2018 ◽  
Author(s):  
John Joseph Taylor

An interpretation of quantum mechanics involving multiple dimensions is proposed, as well as a thought experiment that, if performed correctly, could in principle either prove or disprove quantum randomness. All outcomes of a particle's wave function manifest, but they manifest in more than three dimensions; when the wave function collapses, we see only the outcome that exists in three dimensions. Furthermore, a particle is a much larger object that exists physically as a wave across more than three dimensions, and the Schrödinger wave is our best available description of it, because it describes the particle in only three dimensions. We cannot observe the particle as a wave because it is spread out as an object most of which exists in more than three dimensions; but when we observe the part, or outcome, of the wave function that does exist in three dimensions (which is when collapse occurs), particle-like properties arise, because that part cannot interact with the rest of the wave: it is confined to interacting on a three-dimensional scale, since we are observing it in three dimensions. Furthermore, we cannot observe in three dimensions the part of the wave function that exists in more than three dimensions, because of the principle that to observe an object in its entirety it must be observed in all of its dimensions. Strange phenomena in quantum mechanics, such as tunnelling, can be explained by saying that there is a probability of finding the three-dimensional part of the wave function on the other side of the barrier, having travelled over that barrier classically, and that this probability decreases exponentially as the width of the barrier increases. Whether the quantum world is random or determined by non-local hidden variables can be decided by a simple deductive thought experiment, as outlined in this article.
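The exponential decrease of tunnelling probability with barrier width mentioned in the abstract is standardly captured by the WKB estimate T ≈ exp(-2κL) for a rectangular barrier, with κ = √(2m(V−E))/ħ. A short numerical sketch, using illustrative electron-scale values chosen for the example:

```python
import math

# Standard WKB transmission estimate for a rectangular barrier of height V
# and width L: T ~ exp(-2*kappa*L). The particle energy and barrier values
# below are illustrative, not taken from the article. SI units throughout.

hbar = 1.054571817e-34      # reduced Planck constant (J*s)
m = 9.1093837015e-31        # electron mass (kg)
eV = 1.602176634e-19        # joules per electronvolt

def transmission(E_eV, V_eV, L_m):
    """Approximate tunnelling probability through a barrier (E < V)."""
    kappa = math.sqrt(2 * m * (V_eV - E_eV) * eV) / hbar
    return math.exp(-2 * kappa * L_m)

# Doubling the barrier width squares the (already small) transmission:
t1 = transmission(1.0, 5.0, 1e-10)   # 0.1 nm barrier
t2 = transmission(1.0, 5.0, 2e-10)   # 0.2 nm barrier
print(t1, t2)
```

Because T depends on L inside the exponent, each added increment of width multiplies the probability by the same factor, which is the exponential decrease the abstract describes.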


1978 ◽  
Vol 43 (1) ◽  
pp. 65-72 ◽  
Author(s):  
A. Baracca ◽  
A. Cornia ◽  
R. Livi ◽  
S. Ruffo

1995 ◽  
Vol 1 (2) ◽  
pp. 163-190 ◽  
Author(s):  
Kenneth W. Church ◽  
William A. Gale

Shannon (1948) showed that a wide range of practical problems can be reduced to the problem of estimating probability distributions of words and ngrams in text. It has become standard practice in text compression, speech recognition, information retrieval and many other applications of Shannon's theory to introduce a "bag-of-words" assumption. But obviously, word rates vary from genre to genre, author to author, topic to topic, document to document, section to section, and paragraph to paragraph. The proposed Poisson mixture captures much of this heterogeneous structure by allowing the Poisson parameter θ to vary over documents subject to a density function φ. φ is intended to capture dependencies on hidden variables such as genre, author, topic, etc. (The negative binomial is a well-known special case where φ is a Γ distribution.) Poisson mixtures fit the data better than standard Poissons, producing more accurate estimates of the variance over documents (σ²), entropy (H), inverse document frequency (IDF), and adaptation (Pr(x ≥ 2 | x ≥ 1)).
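The mixture construction above can be simulated directly: draw a per-document rate θ from the Gamma mixing density φ, then a word count from Poisson(θ). The parameter values below are illustrative, not taken from the paper; the point is that the resulting counts are overdispersed (variance over documents exceeding the mean), which a single Poisson cannot produce.

```python
import math
import random

# Hedged simulation of the Gamma-Poisson mixture (i.e. the negative binomial
# special case mentioned in the abstract). Parameters are illustrative only.

random.seed(0)

def sample_poisson(lam):
    """Knuth's simple Poisson sampler (adequate for the small rates here)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def poisson_mixture(shape, scale, n_docs):
    """Per-document word counts under theta ~ Gamma(shape, scale)."""
    counts = []
    for _ in range(n_docs):
        theta = random.gammavariate(shape, scale)  # phi: the mixing density
        counts.append(sample_poisson(theta))
    return counts

counts = poisson_mixture(2.0, 3.0, 20000)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# For this mixture the variance is mean * (1 + scale), so var >> mean,
# whereas a single Poisson would force var == mean.
```

A plain Poisson fit to the same counts would match the mean but badly underestimate σ², which is the motivation the abstract gives for preferring the mixture.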


2021 ◽  
Vol 376 (1821) ◽  
pp. 20190765 ◽  
Author(s):  
Giovanni Pezzulo ◽  
Joshua LaPalme ◽  
Fallon Durant ◽  
Michael Levin

Nervous systems’ computational abilities are an evolutionary innovation, specializing and speed-optimizing ancient biophysical dynamics. Bioelectric signalling originated in cells' communication with the outside world and with each other, enabling cooperation towards adaptive construction and repair of multicellular bodies. Here, we review the emerging field of developmental bioelectricity, which links the field of basal cognition to state-of-the-art questions in regenerative medicine, synthetic bioengineering and even artificial intelligence. One of the predictions of this view is that regeneration and regulative development can restore correct large-scale anatomies from diverse starting states because, like the brain, they exploit bioelectric encoding of distributed goal states—in this case, pattern memories. We propose a new interpretation of recent stochastic regenerative phenotypes in planaria, by appealing to computational models of memory representation and processing in the brain. Moreover, we discuss novel findings showing that bioelectric changes induced in planaria can be stored in tissue for over a week, thus revealing that somatic bioelectric circuits in vivo can implement a long-term, re-writable memory medium. A consideration of the mechanisms, evolution and functionality of basal cognition makes novel predictions and provides an integrative perspective on the evolution, physiology and biomedicine of information processing in vivo. This article is part of the theme issue ‘Basal cognition: multicellularity, neurons and the cognitive lens’.

