Theory of Statistics - George Mason University


4.7 Computational Methods

Markov Chain Monte Carlo

Monte Carlo techniques often allow us to make statistical inferences when the statistical method involves intractable expressions. In applications in Bayesian inference, we can study the posterior distribution, which may be intractable or which may be known only proportionally, by studying random samples from that distribution.

In parametric Bayesian inference, the objective is to obtain the conditional posterior distribution of the parameter, given the observed data. This is QH in equation (4.2), and it is defined by the density in step 5 in the procedure outlined in Section 4.2.3. This density contains all of the information about the parameter of interest, although we may wish to use it for specific types of inference about the parameter, such as a point estimator or a credible set.

Understanding the Posterior Distribution

As with any probability distribution, a good way to understand the posterior distribution is to take a random sample from it. In the case of the posterior distribution, we cannot take a physical random sample. We can, however, simulate a random sample, using methods discussed in Section 0.0.7, beginning on page 657.

In single-parameter cases, random samples from the posterior distribution can often be generated using a direct acceptance/rejection method if the constant of proportionality is known. If the posterior density is known only proportionally, a Metropolis-Hastings method often can be used.

Often the posterior density is a fairly complicated function, especially in multi-parameter cases or in hierarchical models. In such cases, we may be able to express the conditional density of each parameter given all of the other parameters. In this case, it is fairly straightforward to use a Gibbs sampling method to generate samples from the multivariate distribution. Consider the relatively simple case in Example 4.5. The joint posterior PDF is given in equation (4.32). We can get a better picture of this distribution by simulating random observations from it. To do this we generate a realization σ² from the marginal posterior with PDF given in equation (4.33), and then with that value of σ², we generate a realization µ from the conditional posterior with PDF given in equation (4.34).

Example 4.19 illustrates this technique for a hierarchical model.
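The two-stage draw described above (σ² from its marginal posterior, then µ from its conditional posterior) can be sketched as follows for a normal model. Equations (4.32)–(4.34) are not reproduced here, so the inverse-gamma marginal for σ² and the normal conditional for µ below are the standard results under a flat prior on (µ, log σ), an assumption for illustration rather than the book's equations verbatim.

```python
import random

def sample_posterior(xbar, s2, n, n_draws=1000):
    """Two-stage (composition) sampling for a normal model.
    Assumption: under a flat prior on (mu, log sigma),
      sigma^2 | x ~ inverse-gamma((n-1)/2, (n-1) s^2 / 2)
      mu | sigma^2, x ~ N(xbar, sigma^2 / n).
    These stand in for the PDFs in equations (4.33) and (4.34)."""
    draws = []
    for _ in range(n_draws):
        # Draw sigma^2 from its marginal posterior: if G ~ Gamma(shape a, scale 1/b)
        # then 1/G ~ inverse-gamma(a, b).
        g = random.gammavariate((n - 1) / 2.0, 2.0 / ((n - 1) * s2))
        sigma2 = 1.0 / g
        # Draw mu from its conditional posterior given this sigma^2.
        mu = random.gauss(xbar, (sigma2 / n) ** 0.5)
        draws.append((mu, sigma2))
    return draws

random.seed(1)
draws = sample_posterior(xbar=10.0, s2=4.0, n=50)
mus = [d[0] for d in draws]
print(sum(mus) / len(mus))  # centered near the sample mean, 10
```

Because the marginal of σ² is available here, this is strictly composition sampling rather than an iterative Gibbs scan; in hierarchical models where only the full conditionals are known, the same two lines would instead be cycled repeatedly, each draw conditioning on the most recent value of the other parameter.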

The simulated random samples from the posterior distribution give us a picture of the density. It is often useful to make pairwise scatter plots of the samples or estimated contour plots of the density based on the samples.

Simulated random samples can be used to approximate expectations of functions of the random parameters with respect to the posterior density (this is Monte Carlo quadrature), and they can also be used to identify other properties of the posterior distribution, such as its mode.

Theory of Statistics © 2000–2013 James E. Gentle
