Jeffrey Conditioning: Recently Published Documents

TOTAL DOCUMENTS: 6 (five years: 0)
H-INDEX: 3 (five years: 0)

2018, Vol 85 (1), pp. 21-39
Author(s): Denis Bonnay, Mikaël Cozic

Author(s): Sandy Zabell

The history of the use of symmetry arguments in probability theory is traced. After a brief consideration of why these did not occur in ancient Greece, the use of symmetry in probability, starting in the 17th century, is considered. Some of the contributions of Bernoulli, Bayes, Laplace, W. E. Johnson, and Bruno de Finetti are described. One important thread here is the progressive move from using symmetry to identify a single, unique probability function to using it instead to narrow the possibilities to a family of candidate functions via the qualitative concept of exchangeability. A number of modern developments are then discussed: partial exchangeability, the sampling of species problem, and Jeffrey conditioning. Finally, the use or misuse of seemingly innocent symmetry assumptions is illustrated, using a number of apparent paradoxes that have been widely discussed.


2015, Vol 8 (4), pp. 611-648
Author(s): Simon M. Huttegger

Abstract: We explore the question of whether sustained rational disagreement is possible from a broadly Bayesian perspective. The setting is one where agents update on the same information, with special consideration given to the case of uncertain information. The classical merging-of-opinions theorem of Blackwell and Dubins shows when updated beliefs come and stay close together under Bayesian conditioning. We extend this result to a type of Jeffrey conditioning in which agents update on evidence that is uncertain but solid (hard Jeffrey shifts). However, merging of beliefs does not generally hold for Jeffrey conditioning on evidence that is fluid (soft Jeffrey shifts, Field shifts). Several theorems on the asymptotic behavior of subjective probabilities are proven. Taken together, they show that while a consensus nearly always emerges in important special cases, sustained rational disagreement can be expected in many other situations.
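The hard Jeffrey shifts discussed in this abstract follow Jeffrey's update rule: when evidence changes the probabilities of a partition {E_i} to new values q_i without making any cell certain, the new probability of A is P_new(A) = Σ_i P_old(A | E_i) · q_i. A minimal numerical sketch (all partition cells, conditional probabilities, and shift values below are illustrative, not taken from the paper):

```python
def jeffrey_update(p_a_given_e, q):
    """Jeffrey conditioning on a partition {E_i}.

    p_a_given_e[i] -- P_old(A | E_i), held fixed by the update (rigidity)
    q[i]           -- new probability assigned to E_i by the evidence
    Returns P_new(A) = sum_i P_old(A | E_i) * q[i].
    """
    assert abs(sum(q) - 1.0) < 1e-9, "new partition probabilities must sum to 1"
    return sum(pa * qi for pa, qi in zip(p_a_given_e, q))

# Illustrative hard shift: a glimpse in dim light moves P(E1) from 0.5 to 0.7
# without making E1 certain. With P_old(A|E1) = 0.9 and P_old(A|E2) = 0.2:
print(jeffrey_update([0.9, 0.2], [0.7, 0.3]))  # 0.69

# Ordinary Bayesian conditioning is the special case where some q_i = 1:
print(jeffrey_update([0.9, 0.2], [1.0, 0.0]))  # 0.9
```

Note the "rigidity" assumption built into the sketch: the conditional probabilities P_old(A | E_i) are carried over unchanged, which is exactly what distinguishes a hard Jeffrey shift from the fluid (soft) shifts for which the abstract reports that merging can fail.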


2009, Vol 18 (2), pp. 336-345
Author(s): C. Wagner

2003, Vol 19, pp. 243-278
Author(s): P. D. Grunwald, J. Y. Halpern

As examples such as the Monty Hall puzzle show, applying conditioning to update a probability distribution on a "naive space," which does not take into account the protocol used, can often lead to counterintuitive results. Here we examine why. A criterion known as CAR ("coarsening at random") in the statistical literature characterizes when "naive" conditioning in a naive space works. We show that the CAR condition holds rather infrequently, and we provide a procedural characterization of it by giving a randomized algorithm that generates all and only the distributions for which CAR holds. This substantially extends previous characterizations of CAR. We also consider more general notions of update, such as Jeffrey conditioning and minimizing relative entropy (MRE). We give a generalization of the CAR condition that characterizes when Jeffrey conditioning leads to appropriate answers, and we show that there exist some very simple settings in which MRE essentially never gives the right results. This generalizes and interconnects previous results obtained in the literature on CAR and MRE.
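The Monty Hall point in this abstract can be made concrete by enumeration. Naive conditioning on "the car is not behind the opened door" gives 1/2 for each remaining door; conditioning in the full space, which models the host's protocol (never open the player's door, never reveal the car, choose uniformly when two doors qualify), gives 2/3 to switching. A small sketch under those standard protocol assumptions, with the player fixed at door 1:

```python
from fractions import Fraction

def monty_hall_posterior():
    """Enumerate (car location, opened door) outcomes with exact weights,
    then condition on the observation 'host opened door 3'."""
    outcomes = {}
    for car in (1, 2, 3):
        prior = Fraction(1, 3)
        # Host may open door 2 or 3, but never the car's door.
        choices = [d for d in (2, 3) if d != car]
        for opened in choices:
            # Uniform tie-break when the host has two legal doors.
            outcomes[(car, opened)] = prior / len(choices)
    # Condition on the actual evidence: door 3 was opened.
    evid = {k: w for k, w in outcomes.items() if k[1] == 3}
    total = sum(evid.values())
    return {car: w / total for (car, _), w in evid.items()}

post = monty_hall_posterior()
print(post[1])  # Fraction(1, 3): sticking with door 1
print(post[2])  # Fraction(2, 3): switching to door 2 wins
```

The naive space would contain only the car's location, and the evidence "car is not behind door 3" would then yield 1/2 for each of doors 1 and 2; the discrepancy is exactly the protocol-sensitivity the CAR condition diagnoses.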

