
Example 1.17 Autocovariance of a Moving Average

Consider applying a three-point moving average to the white noise series $w_t$ of the previous example, as in Example 1.9. In this case,

$$\gamma_v(s, t) = \mathrm{cov}(v_s, v_t) = \mathrm{cov}\left\{ \tfrac{1}{3}(w_{s-1} + w_s + w_{s+1}),\ \tfrac{1}{3}(w_{t-1} + w_t + w_{t+1}) \right\}.$$

When $s = t$ we have$^3$

$$\gamma_v(t, t) = \tfrac{1}{9}\,\mathrm{cov}\{(w_{t-1} + w_t + w_{t+1}),\ (w_{t-1} + w_t + w_{t+1})\}
= \tfrac{1}{9}\left[\mathrm{cov}(w_{t-1}, w_{t-1}) + \mathrm{cov}(w_t, w_t) + \mathrm{cov}(w_{t+1}, w_{t+1})\right]
= \tfrac{3}{9}\,\sigma_w^2.$$

When $s = t + 1$,

$$\gamma_v(t+1, t) = \tfrac{1}{9}\,\mathrm{cov}\{(w_t + w_{t+1} + w_{t+2}),\ (w_{t-1} + w_t + w_{t+1})\}
= \tfrac{1}{9}\left[\mathrm{cov}(w_t, w_t) + \mathrm{cov}(w_{t+1}, w_{t+1})\right]
= \tfrac{2}{9}\,\sigma_w^2,$$

using (1.12). Similar computations give $\gamma_v(t-1, t) = 2\sigma_w^2/9$, $\gamma_v(t+2, t) = \gamma_v(t-2, t) = \sigma_w^2/9$, and $0$ when $|t - s| > 2$. We summarize the values for all $s$ and $t$ as

$$\gamma_v(s, t) = \begin{cases}
\tfrac{3}{9}\,\sigma_w^2 & s = t, \\
\tfrac{2}{9}\,\sigma_w^2 & |s - t| = 1, \\
\tfrac{1}{9}\,\sigma_w^2 & |s - t| = 2, \\
0 & |s - t| > 2.
\end{cases} \qquad (1.13)$$

Example 1.17 shows clearly that the smoothing operation introduces a covariance function that decreases as the separation between the two time points increases and disappears completely when the time points are separated by three or more time points. This particular autocovariance is interesting because it only depends on the time separation or lag and not on the absolute location of the points along the series. We shall see later that this dependence suggests a mathematical model for the concept of weak stationarity.
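As an informal check on (1.13), here is a minimal R sketch (not from the text): it simulates Gaussian white noise with $\sigma_w^2 = 1$, forms the three-point moving average with filter(), and compares the sample autocovariances at lags 0 through 3 with the theoretical values 3/9, 2/9, 1/9, and 0. The series length and the seed are arbitrary choices.

set.seed(1)
n <- 100000
w <- rnorm(n)                           # white noise with sigma_w^2 = 1 (assumed)
v <- filter(w, rep(1/3, 3), sides = 2)  # three-point moving average v_t
v <- v[!is.na(v)]                       # drop the endpoints lost to the filter
# sample autocovariances at lags 0, 1, 2, 3 ...
acf(v, lag.max = 3, type = "covariance", plot = FALSE)$acf
# ... versus the theoretical values from (1.13)
c(3, 2, 1, 0) / 9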

Example 1.18 Autocovariance of a Random Walk

For the random walk model, $x_t = \sum_{j=1}^{t} w_j$, we have

$$\gamma_x(s, t) = \mathrm{cov}(x_s, x_t) = \mathrm{cov}\left( \sum_{j=1}^{s} w_j,\ \sum_{k=1}^{t} w_k \right) = \min\{s, t\}\,\sigma_w^2,$$

because the $w_t$ are uncorrelated random variables. Note that, as opposed to the previous examples, the autocovariance function of a random walk depends on the particular time values $s$ and $t$, and not on the time separation or lag.
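In the same spirit, a brief R sketch (again not from the text) can approximate $\gamma_x(s, t)$ by the sample covariance of $x_s$ and $x_t$ over many simulated walks; with $\sigma_w^2 = 1$ the result should be close to $\min\{s, t\}$. The number of replications, the walk length, and the time points $s = 10$, $t = 30$ are arbitrary.

set.seed(2)
nrep <- 10000                          # number of simulated walks (arbitrary)
n <- 50                                # length of each walk (arbitrary)
w <- matrix(rnorm(nrep * n), nrep, n)  # white noise with sigma_w^2 = 1
x <- t(apply(w, 1, cumsum))            # each row is one random walk x_1, ..., x_n
s <- 10; t <- 30                       # two time points
cov(x[, s], x[, t])                    # sample covariance, roughly min(s, t) = 10
min(s, t)                              # theoretical gamma_x(s, t) when sigma_w^2 = 1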

$^3$ If the random variables $U = \sum_{j=1}^{m} a_j X_j$ and $V = \sum_{k=1}^{r} b_k Y_k$ are linear combinations of the random variables $\{X_j\}$ and $\{Y_k\}$, respectively, then $\mathrm{cov}(U, V) = \sum_{j=1}^{m} \sum_{k=1}^{r} a_j b_k\,\mathrm{cov}(X_j, Y_k)$. Furthermore, $\mathrm{var}(U) = \mathrm{cov}(U, U)$.
