

7.6.2 Forecasting Multivariate Autoregressive Processes

The technique developed in Section 7.5 allows us to compute the minimum mean squared error one-step linear predictors $\hat{X}_{n+1}$ for any multivariate stationary time series from the mean $\mu$ and autocovariance matrices $\Gamma(h)$ by recursively determining the coefficients $\Phi_{ni}$, $i = 1, \ldots, n$, and evaluating

$$\hat{X}_{n+1} = \mu + \Phi_{n1}(X_n - \mu) + \cdots + \Phi_{nn}(X_1 - \mu). \tag{7.6.9}$$

The situation is simplified when $\{X_t\}$ is the causal AR($p$) process defined by (7.6.4), since for $n \ge p$ (as is almost always the case in practice)

$$\hat{X}_{n+1} = \Phi_1 X_n + \cdots + \Phi_p X_{n+1-p}. \tag{7.6.10}$$

To verify (7.6.10) it suffices to observe that the right-hand side has the required form (7.5.2) and that the prediction error

$$X_{n+1} - \Phi_1 X_n - \cdots - \Phi_p X_{n+1-p} = Z_{n+1}$$

is orthogonal to $X_1, \ldots, X_n$ in the sense of (7.5.3). (In fact, the prediction error is orthogonal to all $X_j$, $-\infty < j \le n$.)
