
Bayesian Methods for Astrophysics and Particle Physics


1.6 Numerical Methods

The calculation of the posterior probability and the evaluation of the Bayesian evidence are generally not simple tasks. Priors and posteriors are often complex distributions which may not be easily expressed in analytical formulae. The simplest approach is to evaluate the posterior probability density on a discrete parameter grid, but this approach becomes impractical even for problems with very few (∼ 5) dimensions, as the number of posterior probability evaluations required to produce a fine grid grows exponentially with the dimensionality of the problem. In the rest of this section, we describe the workhorse of Bayesian modelling, the Metropolis–Hastings algorithm based on Markov Chain Monte Carlo (MCMC) sampling, and the standard method of evaluating the Bayesian evidence, “Thermodynamic Integration”.
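The exponential growth of the grid cost can be seen directly: with n points per axis, a d-dimensional grid requires n^d posterior evaluations. A minimal sketch of this arithmetic (the resolution of 50 points per axis is an arbitrary illustrative choice):

```python
# Cost of grid-based posterior evaluation: with n points per axis,
# a d-dimensional grid requires n**d posterior evaluations.
points_per_axis = 50  # illustrative grid resolution

for ndim in (1, 2, 5, 10):
    n_evals = points_per_axis ** ndim
    print(f"{ndim:2d} dimensions: {n_evals:.3e} posterior evaluations")
```

Even at this modest resolution, five dimensions already demand over 10^8 evaluations, and ten dimensions close to 10^17, far beyond what any realistic likelihood code can afford.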

1.6.1 Markov Chain Monte Carlo (MCMC) Sampling

As discussed in the previous section, the calculation of the posterior probability and the Bayesian evidence presents particularly challenging problems even for moderate-dimensional parameter spaces. Rather than evaluating the posterior probability density on a fine grid, a much more efficient approach is to draw Monte Carlo samples from the posterior probability distribution such that the number density of these Monte Carlo samples is proportional to the posterior probability density. Once enough posterior samples have been drawn, a smoothed histogram of these samples is a good representation of the posterior density. Another advantage of Monte Carlo sampling of the posterior is that marginalization is straightforward: simply disregard the “uninteresting” parameters and make a histogram of the “interesting” parameter sample values.
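This marginalization recipe can be sketched as follows. Here, draws from a correlated two-dimensional Gaussian stand in for an actual MCMC chain, and the means, covariance, and sample count are purely illustrative:

```python
import numpy as np

# Stand-in for a posterior-sample chain: draws from a correlated
# 2D Gaussian (illustrative values, not a real posterior).
rng = np.random.default_rng(0)
mean = [1.0, -2.0]
cov = [[1.0, 0.6],
       [0.6, 2.0]]
samples = rng.multivariate_normal(mean, cov, size=100_000)  # shape (N, 2)

# Marginal posterior of the first ("interesting") parameter:
# drop the second column and histogram what remains.
marginal, edges = np.histogram(samples[:, 0], bins=50, density=True)

# The mean of the retained samples approximates the marginal
# posterior mean of that parameter.
print(samples[:, 0].mean())  # close to 1.0
```

No integration over the discarded parameter is ever performed explicitly; ignoring its column in the sample array is the marginalization.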

The Metropolis–Hastings algorithm (Metropolis et al. 1953) provides an elegant solution to the problem of drawing Monte Carlo samples from the posterior distribution. The Metropolis–Hastings algorithm works by creating a Markov chain: a series of random samples (points in the parameter space) whose values are determined only by the values of previous points in the series. The probability distribution

