statistical syllogism
Recently Published Documents

TOTAL DOCUMENTS: 4 (FIVE YEARS: 0)

H-INDEX: 1 (FIVE YEARS: 0)

Author(s):  
John L. Pollock

I have urged that nomic probability be analyzed in terms of its conceptual role. The conceptual role analysis of nomic probability has four parts: (1) an account of statistical induction; (2) an account of the computational principles that allow some nomic probabilities to be derived from others; (3) an account of acceptance rules; and (4) an account of direct inference. The purpose of the present chapter is to develop and defend the acceptance rules that will play a central role in the theory of nomic probability. The theories of direct inference and statistical induction will then be derived from the acceptance rules and the computational principles defended in the last chapter. Although some of the computational principles are novel, they still amount to little more than an embellishment of the classical probability calculus. The main philosophical weight of the theory of nomic probability will be borne by the acceptance rules. A simple acceptance rule will be described and defended in section 2. The epistemological framework presupposed by the rule will be discussed and refined in section 3. Sections 4 and 5 will demonstrate that more powerful rules can be derived from the simple acceptance rule described in section 2. The philosophical literature contains numerous proposals for probabilistic acceptance rules. For instance, the following “Simple Rule” has had a number of proponents: Belief in P is justified iff P is probable. Note, however, that this rule is formulated in terms of definite probabilities, as are most candidate acceptance rules. Nomic probability, by contrast, is an indefinite probability, so it would make no sense to propose a rule like the Simple Rule for nomic probability. Nevertheless, there is an obvious candidate for an acceptance rule formulated in terms of nomic probability. This is the Statistical Syllogism, whose traditional formulation is something like the following: Most A’s are B’s; this is an A; therefore, this is a B. It seems clear that we often reason in roughly this way. For instance, on what basis do I believe what I read in the newspaper?
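As a rough sketch only (not Pollock's official formulation, which treats the inference as defeasible and subject to defeaters), the statistical syllogism can be rendered as a threshold acceptance rule; the function name and the threshold r below are illustrative assumptions:

```python
# Toy sketch of the statistical syllogism as a threshold acceptance rule.
# Given that prob(B | A) >= r and that c is an A, we tentatively accept
# "c is a B" -- a defeasible inference, not a deduction.

def statistical_syllogism(prob_B_given_A: float, is_A: bool, r: float = 0.9) -> bool:
    """Accept 'this is a B' when most A's are B's and this is an A."""
    return is_A and prob_B_given_A >= r

# "Most things printed in the newspaper are true; this was printed in
# the newspaper; so, defeasibly, this is true."
print(statistical_syllogism(prob_B_given_A=0.95, is_A=True))   # True
print(statistical_syllogism(prob_B_given_A=0.50, is_A=True))   # False
```

A fuller model would also track defeaters that can retract the conclusion even when the threshold is met.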


1975 · Vol 14 (02) · pp. 76-80
Author(s):  
K. Kayser

Among the diagnostic algorithms, the statistical syllogism, the Westmeyer model and the confirmative model are discussed. Three methods of eliminating the logical inconsistencies of the statistical syllogism are shown: 1) substitution of the inductive probability for the objective probability; 2) replacement of the unconditional probability by the probability restricted to the specific antecedents; 3) proposing a certain distribution of diagnostic rules. Particularly 2) and 3) are of significance in the construction of a medical thesaurus and point to the separation of identification numbers and search structures. The Westmeyer model can be used to calculate the maximum possible number of diagnoses. The ›static‹ confirmative model can be modified into a dynamic one. This dynamic model allows conclusions regarding the time-dependence of diagnoses and examination methods.
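Method 2 above, restricting the probability to the specific antecedents, can be illustrated with a small invented data set; the symptom/disease labels and frequencies below are hypothetical, not taken from the paper:

```python
# Illustrative sketch: the unconditional probability of a diagnosis can
# differ sharply from the probability restricted to the patient's
# specific antecedents (the reference class actually observed).

# Hypothetical case records: (has_symptom_S, has_disease_D)
records = [
    (True, True), (True, True), (True, False),
    (False, False), (False, False), (False, False),
    (False, True), (False, False), (False, False), (False, False),
]

p_D = sum(d for _, d in records) / len(records)    # unconditional: 3/10
with_S = [d for s, d in records if s]
p_D_given_S = sum(with_S) / len(with_S)            # restricted to antecedent S: 2/3

print(p_D, p_D_given_S)
```

Basing the diagnosis on p_D_given_S rather than p_D is exactly the move from the unconditional probability to the probability limited to the specific antecedents.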


1972 · Vol 1 (2) · pp. 118-138
Author(s):  
Hans J. Hummell

Abstract The structure of some kinds of arguments typically found in theoretical-empirical social science is examined, with special attention to the rules of deduction, which in the majority of cases remain implicit. Four types of propositions are treated: those describing deterministic relations between attributes (statements of the form ‘if x, then y’), those describing monotonic deterministic relations between ordered series (‘the more x, the more y’), and their probabilistic counterparts, together with two classes of rules of deduction (based on the transitivity and conjunctivity of the corresponding statement forms). Of these propositions, only those of the deterministic if-then kind are unproblematic. The thesis is advanced that most of the ‘arguments’ presented in social science that consist of sentences of relatively complicated structure, while using rules valid exclusively for deterministic if-then statements, are not correct. Deterministic propositions having been discussed in Part One, Part Two treats probabilistic ones, with the following results: (I) consideration of the ‘statistical syllogism’ shows the non-conjunctivity of probabilistic implications; (II) their non-transitivity follows from the fact that in complex causal structures direct and indirect effects may vary independently of each other. For the simplest case of three attributes, a qualitative analogue to the Simon-Blalock procedure is constructed and illustrated with an example from mobility research. (III) If monotonic probabilistic relations are characterized by coefficients of linear correlation, in the general case no deductions are possible that would use a rule of transitivity.
As a general solution to the problem of deducing valid conclusions, it is suggested that the verbal language (which, though enriched by a technical vocabulary, rests in its formal apparatus entirely on elementary logic) should be replaced by languages of richer logical structure; this, however, means formalization. In accordance with this function of warranting valid deductions, the concept of formalization is specified and delimited from similar concepts as the construction of propositions within a linguistic framework of relatively rich logical structure, given as a special calculus.
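Results (I) and (II) admit small numerical illustrations; the die example and the invented population below are illustrative assumptions, not taken from the paper:

```python
from fractions import Fraction

# (I) Non-conjunctivity: each conjunct is probable given A, but their
# conjunction is less so. Roll a fair die (A = "a die is rolled"):
# P(not a 1 | A) = P(not a 2 | A) = 5/6, yet
# P(not a 1 and not a 2 | A) = 4/6 = 2/3.
outcomes = range(1, 7)
p_not1 = Fraction(sum(1 for x in outcomes if x != 1), 6)
p_not2 = Fraction(sum(1 for x in outcomes if x != 2), 6)
p_both = Fraction(sum(1 for x in outcomes if x != 1 and x != 2), 6)
print(p_not1, p_not2, p_both)  # 5/6 5/6 2/3

# (II) Non-transitivity: an invented population where most A's are B's
# and most B's are C's, yet no A's are C's (the A's are atypical B's).
# Each individual is a triple (A, B, C).
pop = (90 * [(True, True, False)]
       + 10 * [(True, False, False)]
       + 900 * [(False, True, True)])

def pr(pred, given):
    """Relative frequency of pred within the subpopulation where given holds."""
    sub = [x for x in pop if given(x)]
    return Fraction(sum(1 for x in sub if pred(x)), len(sub))

p_B_given_A = pr(lambda t: t[1], lambda t: t[0])   # 9/10
p_C_given_B = pr(lambda t: t[2], lambda t: t[1])   # 10/11
p_C_given_A = pr(lambda t: t[2], lambda t: t[0])   # 0
print(p_B_given_A, p_C_given_B, p_C_given_A)
```

Both computations show that chaining or conjoining merely probable implications, as if they were deterministic if-then statements, is invalid.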


1953 · Vol 50 (26) · pp. 805
Author(s):  
James Willard Oliver
