Bayesian Dynamic Factor Models - Department of Statistical Science ...

Factor Model (LTDFM). Throughout, we denote the hyper-parameters of the latent VAR(1) models in the LTDFM by Θ = {µγ, Φγ, Vγ} for γ ∈ {c, β, h, λ}.

3.2 Priors and posterior computation

Bayesian estimation of standard dynamic factor models using Markov chain Monte Carlo (MCMC) methods is described in Aguilar and West (2000) and Lopes and West (2004). Building on these works, we develop an MCMC algorithm for the LTDFM that explores the full joint posterior distribution p(β1:T, f1:T, c1:T, h1:T, λ1:T, Θ, Ψ, d | y1:T) under certain prior distributions.

First, assuming traditional priors for the VAR parameters in Θ simplifies the conditional posterior distribution of Θ given the other parameters and data; we obtain the set of conditionally independent posteriors π(θγi | γ1:T), where γ ∈ {c, β, h, λ}, θγi = (µγi, φγi, v²γi), and v²γi denotes the i-th diagonal element of Vγ. Similarly, the conditional posterior distribution of ψi reduces to the conditionally independent posterior π(ψi | {fit}t=1:T). To sample these parameters, we can use standard Metropolis-Hastings (MH) methods for sampling parameters in univariate AR models.
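As an illustration of this step, the following sketch runs a random-walk MH update of the AR(1) hyper-parameters (µ, φ, v²) given one latent series γ1:T. The priors, step size, and conditional-likelihood form are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post_ar1(mu, phi, v2, gamma):
    """Log posterior (up to a constant) of AR(1) parameters given a series,
    under weak illustrative priors; stationarity requires |phi| < 1."""
    if not (-1 < phi < 1) or v2 <= 0:
        return -np.inf
    T = len(gamma)
    resid = gamma[1:] - mu - phi * (gamma[:-1] - mu)
    # Conditional likelihood of gamma_{2:T} given gamma_1
    ll = -0.5 * (T - 1) * np.log(2 * np.pi * v2) - 0.5 * np.sum(resid**2) / v2
    # Illustrative priors: mu ~ N(0, 10), inverse-gamma-type penalty on v2
    lp = -0.5 * mu**2 / 10 - 2 * np.log(v2) - 0.01 / v2
    return ll + lp

def mh_step(theta, gamma, step=0.05):
    """One random-walk MH update of (mu, phi, log v2)."""
    mu, phi, v2 = theta
    prop = np.array([mu, phi, np.log(v2)]) + step * rng.standard_normal(3)
    cand = (prop[0], prop[1], np.exp(prop[2]))
    # Jacobian term prop[2] - log(v2) corrects for proposing on the log scale
    log_r = (log_post_ar1(*cand, gamma) - log_post_ar1(mu, phi, v2, gamma)
             + prop[2] - np.log(v2))
    return cand if np.log(rng.uniform()) < log_r else theta

# Simulate a latent AR(1) series and run a short chain
T, mu0, phi0, v0 = 500, 0.3, 0.9, 0.1
g = np.zeros(T)
for t in range(1, T):
    g[t] = mu0 + phi0 * (g[t-1] - mu0) + np.sqrt(v0) * rng.standard_normal()

theta = (0.0, 0.5, 1.0)
for _ in range(2000):
    theta = mh_step(theta, g)
```

In practice this update is applied independently for each i and each γ ∈ {c, β, h, λ}, exploiting the conditional independence noted above.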

Second, sampling the factors ft can be performed by a forward-filtering, backward-sampling (FFBS) algorithm (see, e.g., Prado and West (2010)) to obtain a sample from the joint conditional posterior distribution of the full sequence f1:T, which reduces potentially high autocorrelation in the MCMC draws. When we assume standard independent factors by setting ψj ≡ 0, as in model (2), the conditional posterior distribution of ft reduces to a simple multivariate normal distribution.
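A minimal FFBS sketch for a scalar state (one factor, one observation series) conveys the mechanics; the multivariate version used for f1:T follows the same forward/backward recursions with matrices. All parameter names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def ffbs(y, psi, tau2, sigma2):
    """Forward-filtering backward-sampling for a scalar AR(1) state f_t:
       y_t = f_t + e_t, e_t ~ N(0, sigma2);  f_t = psi f_{t-1} + w_t, w_t ~ N(0, tau2).
    Returns one draw of f_{1:T} from its joint conditional posterior."""
    T = len(y)
    m = np.zeros(T)
    C = np.zeros(T)
    a, R = 0.0, tau2 / (1 - psi**2)            # stationary prior for f_1
    for t in range(T):                          # forward Kalman filter
        if t > 0:
            a = psi * m[t-1]
            R = psi**2 * C[t-1] + tau2
        Q = R + sigma2
        K = R / Q
        m[t] = a + K * (y[t] - a)
        C[t] = (1 - K) * R
    # Backward sampling: draw f_T, then each f_t given f_{t+1}
    f = np.zeros(T)
    f[T-1] = m[T-1] + np.sqrt(C[T-1]) * rng.standard_normal()
    for t in range(T - 2, -1, -1):
        B = C[t] * psi / (psi**2 * C[t] + tau2)
        h = m[t] + B * (f[t+1] - psi * m[t])
        H = C[t] - B * psi * C[t]
        f[t] = h + np.sqrt(H) * rng.standard_normal()
    return f

y = rng.standard_normal(200)
f_draw = ffbs(y, psi=0.8, tau2=0.5, sigma2=1.0)
```

Drawing the whole path f1:T jointly, rather than one ft at a time, is what suppresses the autocorrelation mentioned above.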

Third, the conditional independence structure of the stochastic volatilities (h1:T, λ1:T) allows us to use standard MCMC techniques for traditional univariate stochastic volatility models (Kim et al. (1998), Omori et al. (2007), Shephard and Pitt (1997), Watanabe and Omori (2004)). These works provide efficient algorithms to sample from the joint conditional posterior distributions of {hit}t=1:T and {λit}t=1:T (Prado and West (2010)).
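To make the univariate stochastic volatility update concrete, here is a simple single-site Metropolis sweep over one log-volatility path. This is only an illustrative stand-in: the cited papers use faster samplers (e.g., the mixture-of-normals approximation of Kim et al. (1998)), and all parameter values below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def sv_single_site_sweep(h, y, mu, phi, v2, step=0.3):
    """One single-site Metropolis sweep over log-volatilities h_{1:T} for
       y_t ~ N(0, exp(h_t)), with h_t following a mean-mu AR(1).
       Illustrative only; block/mixture samplers are far more efficient."""
    T = len(h)
    for t in range(T):
        cand = h[t] + step * rng.standard_normal()

        def logp(ht):
            lp = -0.5 * (ht + y[t]**2 * np.exp(-ht))          # measurement term
            if t > 0:                                          # AR(1) from t-1
                lp += -0.5 * (ht - mu - phi * (h[t-1] - mu))**2 / v2
            if t < T - 1:                                      # AR(1) into t+1
                lp += -0.5 * (h[t+1] - mu - phi * (ht - mu))**2 / v2
            return lp

        if np.log(rng.uniform()) < logp(cand) - logp(h[t]):
            h[t] = cand
    return h

y = 0.5 * rng.standard_normal(300)
h = np.zeros(300)
for _ in range(50):
    h = sv_single_site_sweep(h, y, mu=-1.0, phi=0.95, v2=0.05)
```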

Fourth, with regard to sampling β1:T, Nakajima and West (2010) suggest a Metropolis-within-Gibbs sampling strategy. To target the conditional distribution of βt given β−t = β1:T\βt and the other parameters, a proposal draw can be generated straightforwardly from the non-threshold model by fixing (s1t, . . . , spt) = (1, . . . , 1). The standard MH algorithm is completed by accepting the candidate with the MH probability (see Section 2.3 of Nakajima and West (2010)). This generation is implemented sequentially for t = 1 : T.
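The logic of this step can be sketched for a single scalar loading: propose from the conjugate posterior of the non-threshold model, then accept with a ratio in which the prior and proposal terms cancel, leaving only the thresholded-versus-unthresholded likelihood correction. The scalar setup, prior, and variable names are illustrative assumptions, not the paper's exact multivariate algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)

def lt_mh_update(beta_t, y_t, f_t, d, sigma2, prior_mean, prior_var):
    """One MH update of a scalar loading beta_t under a latent threshold:
       the effective loading is b_t = beta_t * 1(|beta_t| >= d)."""
    # Conjugate proposal ignoring the threshold (s_t fixed to 1)
    post_var = 1.0 / (1.0 / prior_var + f_t**2 / sigma2)
    post_mean = post_var * (prior_mean / prior_var + f_t * y_t / sigma2)
    cand = post_mean + np.sqrt(post_var) * rng.standard_normal()

    def loglik(beta):                       # thresholded (target) likelihood
        b = beta if abs(beta) >= d else 0.0
        return -0.5 * (y_t - b * f_t)**2 / sigma2

    def log_prop_lik(beta):                 # non-threshold (proposal) likelihood
        return -0.5 * (y_t - beta * f_t)**2 / sigma2

    # Prior and proposal-likelihood terms cancel in the MH ratio,
    # leaving only the threshold correction
    log_r = ((loglik(cand) - log_prop_lik(cand))
             - (loglik(beta_t) - log_prop_lik(beta_t)))
    return cand if np.log(rng.uniform()) < log_r else beta_t

b = 0.2
for _ in range(100):
    b = lt_mh_update(b, y_t=1.0, f_t=0.8, d=0.3,
                     sigma2=1.0, prior_mean=0.0, prior_var=1.0)
```

In the full algorithm this update runs over the vector βt, sequentially for t = 1 : T.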

Finally, following Nakajima and West (2010), we assume the prior for the latent thresholds

di ∼ U(0, |µi| + Kivi), i = 1, . . . , p,

where (µi, v²i) are the mean and variance of the stationary distribution of the univariate AR(1)

