when the X_i are assumed known and inferences about P are sought. The justification for the less familiar “inverse” application is tied to the fundamental DS “rule of combination” under “independence” that was the pivotal innovation of my earliest papers. The attitude here is that the X_i and P describe features of the real world, with values that may be known or unknown according to context, thereby avoiding the criticism that P and X_i must be viewed asymmetrically because P is “fixed” and X_i is “random.” Under a personalist viewpoint the argument based on asymmetry is not germane. Either or both of the two independence assumptions may be assumed according as P or X_i or both have known values. (In the case of “both,” the U_i become partially known according to the above formulas.)

Precise details of the concepts and operations of the “extended calculus of probability” (ECP) arise naturally when the X_i are fixed, whence the above relations do not determine P uniquely, but instead limit P to an interval associated with personal probabilities determined by the U_i. Under the ECP in its general form, personal probabilities are constructed from a distribution over subsets of the SSS that we call a “mass distribution.” The mass distribution determines by simple sums the (p, q, r) for any desired subset of the SSS, according as mass is restricted to the subset in the case of p, or is restricted to the complement of the subset in the case of q, or has positive accumulated mass in both subsets in the case of r. DS combination of independent component mass distributions involves both intersection of subsets, as in propositional logic, and multiplication of probabilities, as in the ordinary calculus of probability (OCP). The ECP allows not only projecting a mass distribution “down” to margins, as in the OCP, but also inverse projection of a marginal mass distribution “up” to a finer margin of the SSS, or to an SSS that has been expanded by adjoining arbitrarily many new variables. DS combination takes place in principle across input components of evidence that have been projected up to a full SSS, although in practice computational shortcuts are often available. Combined inferences are then computed by projecting down to obtain marginal inferences of practical interest.

Returning to the example called the “fundamental problem of practical statistics,” it can be shown that the result of operating with the inputs of data-determined logical mass distributions, together with inputs of probabilistic mass distributions based on the U_i, leads to a posterior mass distribution that in effect places the unknown P on the interval between the Rth and (R + 1)st ordered values of the U_i, where R denotes the observed number of “heads” in the n observed trials. This probabilistic interval is the basis for significance testing, estimation and prediction.

To test the null hypothesis that P = .25, for example, the user computes the probability p that the probabilistic interval for P is either completely to the right or left of P = .25. The complementary 1 − p is the probability r that the interval covers P = .25, because there is zero probability that the interval shrinks to precisely P. Thus (p, 0, r) is the triple corresponding to the assertion that the null hypothesis fails, and r = 1 − p replaces the controversial
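The (p, q, r) bookkeeping and the rule of combination described above can be made concrete in a few lines of code. The Python sketch below is only an illustration, not anything from the paper: the helper names `triple` and `combine`, the toy three-state SSS, and the mass values are invented for the example. It computes the (p, q, r) triple for a query subset by the simple sums described in the text, and implements DS combination by intersecting focal subsets, multiplying their masses, and renormalizing away the mass that falls on the empty set.

```python
from itertools import product

def triple(mass, subset, sss):
    """(p, q, r) for a query subset: p sums mass confined to the subset,
    q sums mass confined to its complement, and r is the remaining mass
    whose focal set meets both the subset and its complement."""
    complement = sss - subset
    p = sum(w for s, w in mass.items() if s <= subset)
    q = sum(w for s, w in mass.items() if s <= complement)
    return p, q, 1.0 - p - q

def combine(m1, m2):
    """Dempster's rule of combination: intersect focal subsets, multiply
    masses, and renormalize away the mass assigned to the empty set."""
    raw = {}
    for (s1, w1), (s2, w2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        raw[inter] = raw.get(inter, 0.0) + w1 * w2
    conflict = raw.pop(frozenset(), 0.0)
    return {s: w / (1.0 - conflict) for s, w in raw.items()}

# Toy three-state SSS and two independent components of evidence.
sss = frozenset({"a", "b", "c"})
m1 = {frozenset({"a", "b"}): 0.7, sss: 0.3}
m2 = {frozenset({"b", "c"}): 0.6, sss: 0.4}
print(triple(combine(m1, m2), frozenset({"b"}), sss))  # approximately (0.42, 0.0, 0.58)
```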
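The interval inference and the test of P = .25 can likewise be illustrated by simulation. The following Monte Carlo sketch is hypothetical, not the paper's computation: the function name `ds_triple_for_null` and the example values n = 20 and R = 3 are invented for the illustration. It draws n independent Uniform(0, 1) values for the U_i, takes the interval between their Rth and (R + 1)st ordered values, and estimates the probability p that this interval lies entirely to one side of the null value, reporting the (p, 0, r) triple described above.

```python
import random

def ds_triple_for_null(n, R, p0, draws=100_000, seed=0):
    """Estimate the (p, 0, r) triple for the null hypothesis P = p0."""
    rng = random.Random(seed)
    outside = 0
    for _ in range(draws):
        u = sorted(rng.random() for _ in range(n))
        lo = u[R - 1] if R > 0 else 0.0   # Rth ordered U value (0 when R = 0)
        hi = u[R] if R < n else 1.0       # (R + 1)st ordered U value (1 when R = n)
        if hi < p0 or lo > p0:            # interval entirely to one side of p0
            outside += 1
    p = outside / draws
    return p, 0.0, 1.0 - p                # q = 0; r = 1 - p is the mass covering p0

# Illustrative values only: 3 observed "heads" in n = 20 trials, null P = .25.
print(ds_triple_for_null(n=20, R=3, p0=0.25))
```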
