
A central problem in models with high dimensions is that it is very difficult to obtain the marginal distributions by integration. As BBE (2005) state, the irregular nature of the likelihood function makes integration infeasible in practice. Bayesian analysis usually requires integration to obtain the marginal posterior distributions of the individual parameters, in which statistical practitioners are interested, from the joint posterior distribution of all unknown parameters of the model. As already stated, these integrals are very hard to solve in the high-dimensional case. Moreover, the joint posterior density from which the marginals are to be derived is itself very difficult to obtain. MCMC methods such as the Gibbs sampler offer a solution to such high-dimensional problems, in that they implement posterior simulation requiring only the pointwise evaluation of the prior distribution and the likelihood function.
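To make the integration problem concrete, write θ = (θ1, . . . , θd) for the vector of unknown parameters and YT for the data; this notation anticipates Section 4.3.5 and is used here only for illustration. The marginal posterior of a single component θi is the integral of the joint posterior over all remaining components, collected in θ−i:

$$
p(\theta_i \mid Y_T) = \int p(\theta_1, \dots, \theta_d \mid Y_T) \, d\theta_{-i},
$$

a (d − 1)-dimensional integral that quickly becomes intractable as d grows. The Gibbs sampler described below sidesteps this integration entirely.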

A Markov chain refers to a sequence of random variables (Z0, . . . , Zn) generated by a Markov process. MCMC methods attempt to simulate direct draws from some complex distribution of interest. Here the Gibbs sampling methodology offers an easy way to solve the problem, given that the conditional posterior distributions are readily available. It may be employed to obtain the marginals of the parameters easily, without integration and without having to know the joint density.

4.3.5 The Gibbs Sampler

The Gibbs sampling methodology offers an easy way to solve the dimensionality problem, given that the conditional posterior distributions are readily available. To exemplify the Gibbs sampler, think of θ as the parameter space. Furthermore, assume that p(θ | YT) represents the joint probability density function, where YT denotes the data. Following a cyclical iterative pattern, the Gibbs sampler generates the joint distribution p(θ | YT), which can also be referred to as the target distribution that one tries to approximate empirically via a Markov chain. The Gibbs sampler begins with a partitioning or blocking of the parameters into d subvectors, θ′ = (θ1, . . . , θd). In practice, the blocking is chosen so that it is feasible to draw from each of the conditional pdfs.
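To illustrate the cycle just described, the following minimal sketch runs a two-block Gibbs sampler on a toy bivariate normal target whose full conditionals are known normals. The target, the correlation rho, and all variable names are illustrative assumptions, not the FAVAR posterior of this thesis:

```python
# Minimal two-block Gibbs sampler on a toy bivariate normal target
# (an illustrative stand-in for p(theta | Y_T), not the FAVAR posterior).
# Each block's full conditional is a known normal, so every step is a
# direct draw and no integration is ever performed.
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8                            # assumed correlation of the toy target
n_draws, burn = 10_000, 1_000

theta1, theta2 = 0.0, 0.0            # arbitrary starting values for the chain
draws = np.empty((n_draws, 2))

for s in range(n_draws):
    # Block 1: theta1 | theta2 ~ N(rho * theta2, 1 - rho^2)
    theta1 = rng.normal(rho * theta2, np.sqrt(1 - rho**2))
    # Block 2: theta2 | theta1 ~ N(rho * theta1, 1 - rho^2)
    theta2 = rng.normal(rho * theta1, np.sqrt(1 - rho**2))
    draws[s] = theta1, theta2

kept = draws[burn:]                  # discard the transient burn-in draws
print(kept.mean(axis=0))             # approximates the marginal means (≈ 0, 0)
print(np.corrcoef(kept.T)[0, 1])     # approximates the correlation (≈ rho)
```

Each pass of the loop is one full cycle: every block is updated by a direct draw from its conditional posterior, and after the burn-in the retained draws approximate the joint target as well as its marginals, which is exactly the property exploited above.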
