
274 Chapter 8 State-Space Models

With (8.1.2) and (8.4.4) this gives

\[
\begin{aligned}
\Omega_{t+1} &= F_t E(X_t X_t')\,F_t' + Q_t - F_t E(\hat X_t \hat X_t')\,F_t' - \Theta_t \Delta_t^{-1}\Theta_t' \\
             &= F_t \Omega_t F_t' + Q_t - \Theta_t \Delta_t^{-1}\Theta_t'.
\end{aligned}
\]
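The covariance recursion above is a single matrix update per time step. A minimal NumPy sketch (the function name and argument layout are our own; the text does not prescribe an implementation):

```python
import numpy as np

def omega_update(F, Q, theta, delta_inv, omega):
    """One step of the state error covariance recursion:
    Omega_{t+1} = F_t Omega_t F_t' + Q_t - Theta_t Delta_t^{-1} Theta_t'.
    """
    return F @ omega @ F.T + Q - theta @ delta_inv @ theta.T
```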

h-step Prediction of $\{Y_t\}$ Using the Kalman Recursions

The Kalman prediction equations lead to a very simple algorithm for recursive calculation of the best linear mean square predictors $P_t Y_{t+h}$, $h = 1, 2, \ldots$. From (8.4.4), (8.1.1), (8.1.2), and Remark 3 in Section 8.1, we find that

\[
P_t X_{t+1} = F_t P_{t-1} X_t + \Theta_t \Delta_t^{-1}\left(Y_t - P_{t-1} Y_t\right), \tag{8.4.5}
\]

\[
P_t X_{t+h} = F_{t+h-1}\, P_t X_{t+h-1}
            = \left(F_{t+h-1} F_{t+h-2} \cdots F_{t+1}\right) P_t X_{t+1}, \quad h = 2, 3, \ldots, \tag{8.4.6}
\]

and

\[
P_t Y_{t+h} = G_{t+h}\, P_t X_{t+h}, \quad h = 1, 2, \ldots. \tag{8.4.7}
\]
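The three prediction equations chain together naturally in code. The sketch below assumes a time-invariant model ($F_t = F$, $G_t = G$) for brevity; the function name and signature are illustrative, not from the text:

```python
import numpy as np

def predict_h_steps(F, G, x_pred, theta, delta_inv, y_t, y_pred, h):
    """Sketch of (8.4.5)-(8.4.7) for a time-invariant model.

    x_pred : P_{t-1}X_t, the current one-step state predictor
    theta, delta_inv : gain factors Theta_t and Delta_t^{-1}
    y_t, y_pred : observation Y_t and its predictor P_{t-1}Y_t
    Returns the h-step observation predictor P_t Y_{t+h}.
    """
    # (8.4.5): update the one-step state predictor with the innovation
    x = F @ x_pred + theta @ delta_inv @ (y_t - y_pred)
    # (8.4.6): propagate h-1 further steps through the state transition
    for _ in range(h - 1):
        x = F @ x
    # (8.4.7): map the predicted state to the observation
    return G @ x
```

Note that for $h \ge 2$ no new data enter: the predictor is obtained purely by iterating the state transition, as (8.4.6) states.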

From the relation

\[
X_{t+h} - P_t X_{t+h} = F_{t+h-1}\left(X_{t+h-1} - P_t X_{t+h-1}\right) + V_{t+h-1}, \quad h = 2, 3, \ldots,
\]

we find that $\Omega_t^{(h)} := E\left[(X_{t+h} - P_t X_{t+h})(X_{t+h} - P_t X_{t+h})'\right]$ satisfies the recursions

\[
\Omega_t^{(h)} = F_{t+h-1}\, \Omega_t^{(h-1)} F_{t+h-1}' + Q_{t+h-1}, \quad h = 2, 3, \ldots, \tag{8.4.8}
\]

with $\Omega_t^{(1)} = \Omega_{t+1}$. Then from (8.1.1) and (8.4.7), $\Delta_t^{(h)} := E\left[(Y_{t+h} - P_t Y_{t+h})(Y_{t+h} - P_t Y_{t+h})'\right]$ is given by

\[
\Delta_t^{(h)} = G_{t+h}\, \Omega_t^{(h)} G_{t+h}' + R_{t+h}, \quad h = 1, 2, \ldots. \tag{8.4.9}
\]
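The pair (8.4.8)–(8.4.9) translates into a single loop over horizons. A sketch under our own naming conventions (the sequence arguments index the matrices $F_{t+k+1}$, $G_{t+k+1}$, etc., as documented in the docstring):

```python
import numpy as np

def prediction_mse(F_seq, G_seq, Q_seq, R_seq, omega_1):
    """Sketch of (8.4.8)-(8.4.9): h-step prediction error covariances.

    omega_1 : Omega_t^{(1)} = Omega_{t+1}, the one-step state error covariance
    F_seq[k], Q_seq[k] : F_{t+k+1}, Q_{t+k+1}, needed for horizons h >= 2
    G_seq[k], R_seq[k] : G_{t+k+1}, R_{t+k+1} for h = k + 1
    Returns the list of Delta_t^{(h)}, h = 1, ..., len(G_seq).
    """
    omega = omega_1
    deltas = []
    for k in range(len(G_seq)):
        # (8.4.9): observation error covariance at horizon h = k + 1
        deltas.append(G_seq[k] @ omega @ G_seq[k].T + R_seq[k])
        # (8.4.8): advance the state error covariance one more step
        if k + 1 < len(G_seq):
            omega = F_seq[k] @ omega @ F_seq[k].T + Q_seq[k]
    return deltas
```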

Example 8.4.1 Consider the random walk plus noise model of Example 8.2.1 defined by

\[
Y_t = X_t + W_t, \quad \{W_t\} \sim \mathrm{WN}\left(0, \sigma_w^2\right),
\]

where the local level $X_t$ follows the random walk

\[
X_{t+1} = X_t + V_t, \quad \{V_t\} \sim \mathrm{WN}\left(0, \sigma_v^2\right).
\]

Applying the Kalman prediction equations with $Y_0 := 1$, $R = \sigma_w^2$, and $Q = \sigma_v^2$, we obtain

\[
\hat Y_{t+1} = P_t Y_{t+1} = \hat X_t + \frac{\theta_t}{\Delta_t}\left(Y_t - \hat Y_t\right) = (1 - a_t)\,\hat Y_t + a_t Y_t
\]
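Matching the last two expressions gives $a_t = \theta_t/\Delta_t$, so the predictor is an exponential-smoothing-type weighted average of the previous prediction and the newest observation. A minimal scalar sketch, assuming the forms $\theta_t = \Omega_t$ and $\Delta_t = \Omega_t + \sigma_w^2$ that the general recursions yield for this model with $F_t = G_t = 1$ (the function name and the initialization choices are ours, not from the text):

```python
def local_level_predict(ys, sigma_v2, sigma_w2, omega_0):
    """One-step predictors for the random walk plus noise model.

    Uses a_t = theta_t / Delta_t with theta_t = Omega_t,
    Delta_t = Omega_t + sigma_w2, and the covariance update
    Omega_{t+1} = Omega_t + sigma_v2 - Omega_t**2 / Delta_t.
    omega_0 and the start value y_hat = ys[0] are illustrative
    initialization choices.
    Returns the one-step predictors and the forecast of the next Y.
    """
    y_hat, omega = ys[0], omega_0
    preds = []
    for y in ys:
        a = omega / (omega + sigma_w2)      # smoothing weight a_t
        preds.append(y_hat)
        y_hat = (1 - a) * y_hat + a * y     # (1 - a_t) Y_hat_t + a_t Y_t
        omega = omega + sigma_v2 - omega**2 / (omega + sigma_w2)
    return preds, y_hat
```

As $\sigma_v^2/\sigma_w^2$ grows, $a_t$ approaches 1 and the predictor tracks the data closely; as it shrinks, the predictor smooths heavily.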
