DS perspective on statistical inference

…limiting case of the binomial P. I have given eight or so seminars and lectures in recent years, more in Europe than in North America, and largely outside of the field of statistics proper. I have conducted useful correspondences with statisticians, most notably Paul Edlefsen, Chuanhai Liu, and Jonty Rougier, for which I am grateful.

Near the end of my 1966 paper I sketched an argument to the effect that a limiting case of my basic multinomial model for data with k exchangeable cells, wherein the observables become continuous in the limit, leads to defined limiting DS inferences concerning the parameters of widely adopted parametric sampling models. This theory deserves to pass from conjecture to theorem, because it bears on how specific DS methods based on sampling models generalize traditional methods based on likelihood functions that have been studied by mathematical statisticians for most of the 20th century. Whereas individual likelihood functions from each of the n data points in a random sample multiply to yield the combined likelihood function for a complete random sample of size n, the generalizing DS mass distributions from single observations combine under the DS combination rule to provide the mass distribution for DS inference under the full sample. In fact, values of likelihood functions are seen to be identical to “upper probabilities” p + r obtained from DS (p, q, r) inferences for singleton subsets of the parameters. Ordinary likelihood functions are thus seen to provide only part of the information in the data. What they lack is information associated with the “don’t know” feature of the extended calculus of probability.

Detailed connections between the DS system and its predecessors invite illuminating research studies. For example, mass distributions that generalize likelihood functions from random samples combine under the DS rule of combination with prior mass distributions to provide DS inferences that generalize traditional Bayesian inferences. The use of the term “generalize” refers specifically to the recognition that when the DS prior specializes to an ordinary calculus of probability (OCP) prior, the DS combination rule reproduces the traditional “posterior = prior × likelihood” rule of Bayesian inference. In an important sense, the DS rule of combination is thus seen to generalize the OCP axiom that the joint probability of two events equals the marginal probability of one multiplied by the conditional probability of the other given the first.

I believe that an elegant theory of parametric inference is out there just waiting to be explained in precise mathematical terms, with the potential to render moot many of the confusions and controversies of the 20th century over statistical inference.
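The two technical claims above can be illustrated on a small finite parameter space. The Python sketch below is not Dempster’s multinomial construction; it is a minimal illustration under assumed inputs: the three-point frame THETA and the mass function m_data standing in for the evidence from a sample are hypothetical. It implements the standard Dempster rule of combination and checks that combining an ordinary-calculus-of-probability (OCP) prior with m_data yields singleton masses proportional to prior × (p + r), i.e. the “posterior = prior × likelihood” rule, with the upper probability p + r playing the role of the likelihood.

```python
# Minimal sketch of Dempster's rule of combination on a small finite frame.
# The frame THETA and the mass function m_data below are hypothetical
# illustrations, not derived from any particular sampling model.

THETA = frozenset({"theta1", "theta2", "theta3"})


def combine(m1, m2):
    """Dempster's rule of combination for two mass functions on the same frame.

    Each mass function is a dict mapping a focal element (frozenset) to its
    mass; masses sum to 1.  Products of masses flow to the intersection of
    the focal elements, and mass landing on the empty set (conflict) is
    discarded by renormalization.
    """
    raw, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                raw[inter] = raw.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    k = 1.0 - conflict
    return {s: w / k for s, w in raw.items()}


def belief(m, subset):
    """Lower probability p: total mass committed entirely inside `subset`."""
    return sum(w for s, w in m.items() if s <= subset)


def plausibility(m, subset):
    """Upper probability p + r: total mass not committed against `subset`."""
    return sum(w for s, w in m.items() if s & subset)


# Hypothetical mass distribution summarizing the evidence from a sample.
# Unlike a likelihood function, it keeps mass on non-singleton sets: the
# "don't know" component r.
m_data = {
    frozenset({"theta1"}): 0.3,
    frozenset({"theta1", "theta2"}): 0.4,
    THETA: 0.3,
}

# An ordinary-calculus-of-probability (OCP) prior: all mass on singletons.
m_prior = {
    frozenset({"theta1"}): 0.2,
    frozenset({"theta2"}): 0.5,
    frozenset({"theta3"}): 0.3,
}

posterior = combine(m_prior, m_data)

# For each singleton, report the (p, q, r) triple from m_data and the DS
# posterior mass after combination with the OCP prior.
for t in sorted(THETA):
    s = frozenset({t})
    p = belief(m_data, s)
    upper = plausibility(m_data, s)      # p + r, playing the likelihood's role
    q, r = 1.0 - upper, upper - p
    print(f"{t}: (p, q, r) = ({p:.2f}, {q:.2f}, {r:.2f}), "
          f"posterior = {posterior.get(s, 0.0):.4f}")

# Check the claim in the text: with an OCP prior, Dempster's rule reproduces
# posterior  proportional to  prior * (p + r), i.e. "posterior = prior x likelihood".
unnorm = {t: m_prior[frozenset({t})] * plausibility(m_data, frozenset({t}))
          for t in THETA}
z = sum(unnorm.values())
for t in THETA:
    assert abs(posterior[frozenset({t})] - unnorm[t] / z) < 1e-12
```

Note that m_data keeps mass on non-singleton focal elements; that is the “don’t know” component r which an ordinary likelihood function cannot represent, and which the combination rule carries through when the prior is itself a DS mass distribution rather than an OCP prior.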
