On joint distributions, counterfactual values and hidden variables in understanding contextuality

Author(s):  
Ehtibar N. Dzhafarov

This paper deals with three traditional ways of defining contextuality: (C1) in terms of the (non)existence of certain joint distributions involving measurements made in several mutually exclusive contexts; (C2) in terms of the relationship between factual measurements in a given context and counterfactual measurements that could be made if one used other contexts; and (C3) in terms of the (non)existence of ‘hidden variables’ that determine the outcomes of all factually performed measurements. It is generally believed that the three meanings are equivalent, but the issues involved are not entirely transparent. Thus, arguments have been offered that C2 may have nothing to do with C1, and the traditional formulation of C1 itself encounters difficulties when measurement outcomes in a contextual system are treated as random variables. I show that if C1 is formulated within the framework of the Contextuality-by-Default (CbD) theory, the notion of a probabilistic coupling, the core mathematical tool of CbD, subsumes both counterfactual values and ‘hidden variables’. In the latter case, a coupling itself can be viewed as a maximally parsimonious choice of a hidden variable. This article is part of the theme issue ‘Contextuality and probability in quantum mechanics and beyond’.
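A minimal sketch (not code from the paper) of the coupling concept the abstract refers to: given two Bernoulli random variables with possibly different distributions, recorded in different contexts, a coupling is any joint distribution on a common space whose marginals match the two originals. The maximal coupling below makes the two variables agree as often as the marginals permit.

```python
# Illustrative sketch: a maximal coupling of two Bernoulli random
# variables, the kind of object CbD uses to compare measurements of
# the same property made in different contexts.

def maximal_coupling(p, q):
    """Joint pmf over {0,1}^2 with marginals Bernoulli(p) and
    Bernoulli(q), maximizing P(X == Y)."""
    return {
        (1, 1): min(p, q),
        (0, 0): min(1 - p, 1 - q),
        (1, 0): max(p - q, 0.0),
        (0, 1): max(q - p, 0.0),
    }

p, q = 0.7, 0.5
joint = maximal_coupling(p, q)

# The marginals are preserved ...
assert abs(sum(v for (x, _), v in joint.items() if x == 1) - p) < 1e-12
assert abs(sum(v for (_, y), v in joint.items() if y == 1) - q) < 1e-12
# ... and the agreement probability attains the bound 1 - |p - q|.
agreement = joint[(1, 1)] + joint[(0, 0)]
assert abs(agreement - (1 - abs(p - q))) < 1e-12
```

In an inconsistently connected system the marginals p and q genuinely differ, and 1 − |p − q| < 1 is the best any coupling can do; CbD's criterion of (non)contextuality turns on whether such maximal couplings can coexist in a single joint distribution for the whole system.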

2021
Author(s):  
Tim C Jenkins

Abstract Superposed wavefunctions in quantum mechanics lead to a squared amplitude that introduces interference into a probability density, which has long been a puzzle because interference between probability densities exists nowhere else in probability theory. In recent years, Man’ko and coauthors have successfully reconciled quantum and classical probability using a symplectic tomographic model. Nevertheless, there remains an unexplained coincidence in quantum mechanics: mathematically, the interference term gives the squared amplitude of superposed wavefunctions the form of a variance of a sum of correlated random variables. We examine whether there could be an archetypal variable behind quantum probability that provides a mathematical foundation consistent with both quantum and classical probability directly. The properties that such a variable would need to satisfy are identified, and a generic hidden variable satisfying them is found that would be present everywhere, transforming into a process-specific variable wherever a quantum process is active. Uncovering this variable confirms the possibility that it could be the stochastic archetype of quantum probability.
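The formal coincidence the abstract points to can be checked directly (a hedged numerical illustration, not the paper's derivation): the squared amplitude of a superposition, |a1 + a2|² = |a1|² + |a2|² + 2 Re(a1* a2), has the same shape as Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y), with the interference term playing the role of the covariance.

```python
import cmath

# Two arbitrary complex amplitudes (moduli and phases chosen freely).
a1 = cmath.rect(0.6, 0.3)   # modulus 0.6, phase 0.3 rad
a2 = cmath.rect(0.8, 1.1)   # modulus 0.8, phase 1.1 rad

# Squared amplitude of the superposition ...
lhs = abs(a1 + a2) ** 2
# ... equals the two "variances" plus twice the interference term,
# mirroring Var(X + Y) = Var(X) + Var(Y) + 2*Cov(X, Y).
rhs = abs(a1) ** 2 + abs(a2) ** 2 + 2 * (a1.conjugate() * a2).real
assert abs(lhs - rhs) < 1e-12
```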


Author(s):  
Matt Jones

A primary goal in recent research on contextuality has been to extend this concept to cases of inconsistent connectedness, where observables have different distributions in different contexts. This article proposes a solution within the framework of probabilistic causal models, which extend hidden-variables theories, and then demonstrates an equivalence to the contextuality-by-default (CbD) framework. CbD distinguishes contextuality from direct influences of context on observables, defining the latter purely in terms of probability distributions. Here, we take a causal view of direct influences, defining direct influence within any causal model as the probability of all latent states of the system in which a change of context changes the outcome of a measurement. Model-based contextuality (M-contextuality) is then defined as the necessity of stronger direct influences to model a full system than when considered individually. For consistently connected systems, M-contextuality agrees with standard contextuality. For general systems, it is proved that M-contextuality is equivalent to the property that any model of a system must contain ‘hidden influences’, meaning direct influences that go in opposite directions for different latent states, or equivalently signalling between observers that carries no information. This criterion can be taken as formalizing the ‘no-conspiracy’ principle that has been proposed in connection with CbD. M-contextuality is then proved to be equivalent to CbD-contextuality, thus providing a new interpretation of CbD-contextuality as the non-existence of a model for a system without hidden direct influences. This article is part of the theme issue ‘Contextuality and probability in quantum mechanics and beyond’.


Author(s):  
Guido Bacciagaluppi

The topic of probability in quantum mechanics is rather vast. In this chapter it is discussed from the perspective of whether and in what sense quantum mechanics requires a generalization of the usual (Kolmogorovian) concept of probability. The focus is on the case of finite-dimensional quantum mechanics (which is analogous to that of discrete probability spaces), partly for simplicity and partly for ease of generalization. While the main emphasis is on formal aspects of quantum probability (in particular the non-existence of joint distributions for incompatible observables), the discussion relates also to notorious issues in the interpretation of quantum mechanics. Indeed, whether quantum probability can or cannot be ultimately reduced to classical probability connects rather nicely to the question of 'hidden variables' in quantum mechanics.



Author(s):  
Janne V. Kujala
Ehtibar N. Dzhafarov

We discuss three measures of the degree of contextuality in contextual systems of dichotomous random variables. These measures are developed within the framework of the Contextuality-by-Default (CbD) theory, and apply to inconsistently connected systems (those with ‘disturbance’ allowed). For one of these measures of contextuality, presented here for the first time, we construct a corresponding measure of the degree of non-contextuality in non-contextual systems. The other two CbD-based measures do not suggest ways in which degree of non-contextuality of a non-contextual system can be quantified. We find the same to be true for the contextual fraction measure developed by Abramsky, Barbosa and Mansfield. This measure of contextuality is confined to consistently connected systems, but CbD allows one to generalize it to arbitrary systems. This article is part of the theme issue ‘Contextuality and probability in quantum mechanics and beyond’.



2006
Vol 84 (6-7)
pp. 633-638
Author(s):  
A A Méthot

The strongest attack against quantum mechanics came in 1935 in the form of a paper by Einstein, Podolsky, and Rosen. They argued that quantum mechanics could not be called a complete theory of Nature, since not every element of reality is represented in the formalism. The authors then put forth a proposition: we must search for a theory in which, upon knowing everything about the system, including possible hidden variables, one could make precise predictions concerning elements of reality. This project was ultimately doomed in 1964 by the work of Bell, who showed that the most general local hidden-variable theory cannot reproduce correlations that arise in quantum mechanics. There are mainly three forms of no-go theorems for local hidden-variable theories. Although almost every physicist knows the consequences of these no-go theorems, not every physicist is aware of the distinctions between the three, or even their exact definitions. Thus, we discuss here the three principal forms of no-go theorems for local hidden-variable theories of Nature. We define Bell theorems, Bell theorems without inequalities, and pseudo-telepathy. A discussion of the similarities and differences follows. PACS Nos.: 03.65.–w, 03.65.Ud, 03.65.Ta
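The first of those three forms, a Bell theorem, can be illustrated numerically (a hedged sketch, not taken from the article): for the singlet state, quantum mechanics predicts the correlation E(a, b) = −cos(a − b) between spin measurements along directions a and b, and at suitably chosen angles the CHSH combination of four such correlations exceeds the bound |S| ≤ 2 obeyed by every local hidden-variable theory.

```python
import math

def E(a, b):
    """Singlet-state correlation between measurements at angles a, b."""
    return -math.cos(a - b)

# Standard maximally violating settings for the CHSH inequality.
a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

assert abs(S) > 2                                    # violates the local bound
assert abs(abs(S) - 2 * math.sqrt(2)) < 1e-12        # Tsirelson bound 2*sqrt(2)
```

Bell theorems without inequalities (e.g. GHZ) and pseudo-telepathy sharpen this statistical gap into outcome-by-outcome contradictions, which is the distinction the article goes on to draw.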


Author(s):  
Gregg Jaeger

The origin and basis of the notion of quantum contextuality are identified in the Copenhagen approach to quantum mechanics, where context is automatically invoked by the requirement that the experimental arrangement involved in any measurement or set of measurements be taken into account, because, in general, the outcome of a measurement may depend on other measurements immediately preceding it or jointly performed on the same system. For Bohr, the specification of the experimental situation of any measurement is essential to its significance, in light of complementarity and the omnipresence of the quantum of action in physics; for Heisenberg, the incompatibility of pairs of sharp measurements belonging to different situations coheres with both the completeness of the quantum state as an objective physical description and the principle of indeterminacy. Here, context in the Copenhagen approach is taken to be the equivalence class of experimental arrangements corresponding to a set of compatible measurements of quantum observables in standard quantum mechanics; the associated form of contextuality arises from the general non-commutativity of sharp observables, proven by von Neumann, which gives rise to distinct contexts. This notion is related to theoretical situations explored later by Bell, by Kochen and Specker, and by others in relation to the classification of hidden-variables theories and elsewhere in physics. This article is part of the theme issue ‘Contextuality and probability in quantum mechanics and beyond’.


2000
Vol 15 (18)
pp. 2813-2820
Author(s):  
ADÁN CABELLO

A recent proposal to experimentally test quantum mechanics against noncontextual hidden-variable theories [Phys. Rev. Lett. 80, 1797 (1998)] is shown to be related to the smallest proof of the Kochen–Specker theorem currently known [Phys. Lett. A 212, 183 (1996)]. This proof contains eighteen yes–no questions about a four-dimensional physical system, combined into nine mutually incompatible tests. When these tests are regarded as tests on a two-part two-state system, quantum mechanics and noncontextual hidden variables make the same predictions for eight of them but different predictions for the ninth. This ninth test would therefore allow us to discriminate between quantum mechanics and noncontextual hidden-variable theories in a single-run (gedanken) experiment.


2006
Vol 20 (3)
pp. 413-427
Author(s):  
Claude Lefèvre

This article is concerned with a loading-dependent model of cascading failure proposed recently by Dobson, Carreras, and Newman [6]. The central problem is to determine the distribution of the total number of initial components that will have finally failed. A new approach based on a close connection with epidemic modeling is developed. This allows us to consider a more general failure model in which the additional loads caused by successive failures are arbitrarily fixed (instead of being constant as in [6]). The key mathematical tool is provided by the partial joint distributions of order statistics for a sample of independent uniform (0,1) random variables.
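The style of cascade being analyzed can be sketched in a few lines (parameter names d and delta are assumed for illustration; this is not the authors' code): components carry i.i.d. uniform(0,1) loads, an initial disturbance d is added to every load, and each failure transfers an extra load delta to every surviving component, with a component failing once its load exceeds 1.

```python
import random

def cascade_size(n=100, d=0.05, delta=0.01, seed=0):
    """Total number of components failed by the end of the cascade."""
    rng = random.Random(seed)
    loads = [rng.random() + d for _ in range(n)]  # uniform load + disturbance
    failed = 0
    while True:
        newly_failed = sum(load > 1.0 for load in loads)
        if newly_failed == 0:
            return failed
        failed += newly_failed
        # Survivors absorb the load shed by this round's failures.
        loads = [load + newly_failed * delta for load in loads if load <= 1.0]

# Empirical distribution of the cascade size over 200 replications.
sizes = [cascade_size(seed=s) for s in range(200)]
assert all(0 <= size <= 100 for size in sizes)
```

The epidemic analogy in the article treats each round of failures like a generation of infections; the order statistics of the uniform loads determine how many components each round pushes past the threshold.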

