Logical Conditioning and Entropy Inference Based on Observational Data
The logical conditioning inference problem is studied for the case in which a simple condition, setting a threshold for a potential observation of a scalar uncertain quantity, is added to the evidence originally available for inference, which includes observational data. This condition can be incorporated in two alternative ways, yielding two conditioned bodies of evidence and, consequently, two different inference problems. To represent the observational data coherently and express the given evidence unambiguously, both problems are formalized in a plausible-logic language with observational data, within the logical probability framework. They are solved by applying the relative entropy method with fractile constraints. A comparison of the resulting solutions shows that one is a particular instance of the other; the broader one therefore constitutes the general solution to the logical conditioning inference problem.
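To make the core technique concrete, the following is a minimal sketch of relative entropy minimization under a fractile constraint, in a discrete setting. It uses the standard closed-form result that minimizing KL(p || q) subject to a constraint fixing the probability mass below a threshold simply rescales the prior q on each side of the threshold. The function name, the example prior, and the choice of threshold and fractile level are illustrative assumptions, not the paper's formalism.

```python
import numpy as np

def min_rel_entropy_fractile(q, threshold_idx, alpha):
    """Minimize KL(p || q) subject to the fractile constraint
    sum(p[:threshold_idx]) == alpha.

    Closed-form solution: rescale the prior mass on each side of the
    threshold so that the constraint holds exactly; within each side,
    relative proportions of the prior are preserved.
    """
    q = np.asarray(q, dtype=float)
    Q = q[:threshold_idx].sum()               # prior mass below the threshold
    p = q.copy()
    p[:threshold_idx] *= alpha / Q            # lower-tail mass becomes alpha
    p[threshold_idx:] *= (1 - alpha) / (1 - Q)  # upper-tail mass becomes 1 - alpha
    return p

# Hypothetical prior over five values of a scalar observable quantity
prior = np.array([0.1, 0.2, 0.3, 0.25, 0.15])
# Condition: the observable falls below the threshold with probability 0.5
post = min_rel_entropy_fractile(prior, threshold_idx=2, alpha=0.5)
```

Within each region the posterior keeps the prior's relative proportions, which is exactly the minimum-discrimination-information update for this kind of constraint.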