Estimating the Codifference Function of Linear Time Series Models ...


where
\[
\xi_{t+j} = \begin{pmatrix} \operatorname{Re}\bigl(\ln N^{-1}\sum_{t=1}^{N} Z_{t+j}\bigr) \\ \operatorname{Im}\bigl(\ln N^{-1}\sum_{t=1}^{N} Z_{t+j}\bigr) \end{pmatrix}
\]
and
\[
\operatorname{Re}\Bigl(\ln N^{-1}\sum_{t=1}^{N} Z_{t+j}\Bigr) = \begin{pmatrix} \operatorname{Re}\bigl(\ln N^{-1}\sum_{t=1}^{N} \exp(-is_l X_t)\bigr) \\ \operatorname{Re}\bigl(\ln N^{-1}\sum_{t=1}^{N} \exp(is_l X_{t+j})\bigr) \\ \operatorname{Re}\bigl(\ln N^{-1}\sum_{t=1}^{N} \exp(is_l (X_{t+j} - X_t))\bigr) \end{pmatrix},
\]
where $l = 1, \ldots, r$ and
\[
\operatorname{Re}\Bigl(\ln N^{-1}\sum_{t=1}^{N} X_j^l\Bigr) = \begin{pmatrix} \operatorname{Re}\bigl(\ln N^{-1}\sum_{t=1}^{N} X_j^1\bigr) \\ \operatorname{Re}\bigl(\ln N^{-1}\sum_{t=1}^{N} X_j^2\bigr) \\ \vdots \\ \operatorname{Re}\bigl(\ln N^{-1}\sum_{t=1}^{N} X_j^r\bigr) \end{pmatrix}
\]
(similarly for the imaginary part; note that the summation and the principal value of ln(·) are defined componentwise), then we have

\[
\lambda \begin{pmatrix} \operatorname{Re}\ln\bigl(N^{-1}\sum_{t=1}^{N} Y_t^T\bigr) \\ \operatorname{Im}\ln\bigl(N^{-1}\sum_{t=1}^{N} Y_t^T\bigr) \end{pmatrix} = \lambda \zeta_t^T = \left[ \begin{pmatrix} \operatorname{Re}\hat\tau^*(s,0) \\ \operatorname{Im}\hat\tau^*(s,0) \end{pmatrix}, \ldots, \begin{pmatrix} \operatorname{Re}\hat\tau^*(s,h) \\ \operatorname{Im}\hat\tau^*(s,h) \end{pmatrix} \right]
\]

where $\lambda$ is as given in (27). We therefore need to show that, when $N \to \infty$,
\[
a^T\bigl(\lambda[\xi_t, \xi_{t+1}, \ldots, \xi_{t+h}]\bigr)^T \text{ is AN}\left( a^T\left( \begin{pmatrix} \operatorname{Re}\tau(0) \\ 0 \end{pmatrix}, \ldots, \begin{pmatrix} \operatorname{Re}\tau(h) \\ 0 \end{pmatrix} \right)^{T},\; N^{-1} a^T M a \right) \tag{37}
\]

for all vectors $a = (a_0, \ldots, a_h)^T \in \mathbb{R}^{h+1}$ such that $a^T M a > 0$, where $M$ is the covariance matrix
\[
M = \bigl[\lambda L_2^p V_{pq} L_2^q \lambda^T\bigr]_{p,q=0,\ldots,h}
\]
and the vectors $\lambda$, $L_2^p$, $L_2^q$ and the matrix $V_{pq}$ are as given in Proposition B.1 above. For any such $a$, the sequence $\{a^T(\lambda\zeta_t^T)^T\}$ is $(m+h)$-dependent, and since by Proposition B.1
\[
\lim_{N\to\infty} \operatorname{var}\bigl(a^T(\lambda[\xi_t, \xi_{t+1}, \ldots, \xi_{t+h}])^T\bigr) = a^T M a > 0,
\]
we can conclude that $\{a^T(\lambda\zeta_t^T)^T\}$ satisfies the conditions of the central limit theorem for $m$-dependent processes (e.g., Brockwell and Davis, 1987, Theorem 6.4.2); therefore, by this theorem, for $N \to \infty$ we obtain the required result (37). The relation $\operatorname{Im}\tau(s,j) = 0$, $j = 0, 1, \ldots, h$, follows directly from identities (29)–(30).
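For completeness, the central limit theorem for $m$-dependent sequences invoked above states that if $\{Y_t\}$ is a strictly stationary $m$-dependent sequence with mean $\mu$ and autocovariance function $\gamma(\cdot)$, then

```latex
\[
N^{1/2}\bigl(\bar{Y}_N - \mu\bigr) \Rightarrow \mathrm{N}(0, v_m),
\qquad v_m = \sum_{|j| \le m} \gamma(j) \neq 0.
\]
```

Here it is applied to the $(m+h)$-dependent sequence $\{a^T(\lambda\zeta_t^T)^T\}$, whose limiting variance $a^T M a$ is positive by assumption.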

Proposition B.3. Proposition B.2 remains true for $X_t$, $t \in \mathbb{Z}$, a stationary linear process (1) satisfying conditions C1 and C2.

Proof. For the proof, we will apply the result of Proposition B.2 to the truncated sequence $X_{tm} = \sum_{j=0}^{m} c_j \epsilon_{t-j}$ and then derive the result for $X_t$ by letting $m \to \infty$. For $0 \le p \le h$, we define
\[
\hat\tau_m^*(s, -s; p) = -\ln\phi_m^*(s, -s; p) + \ln\phi_m^*(s, 0; p) + \ln\phi_m^*(0, -s; p) \tag{38}
\]
where $\phi_m^*(u, v; p) = N^{-1}\sum_{t=1}^{N} \exp\bigl(i(uX_{(t+p)m} + vX_{tm})\bigr)$. Then by Proposition B.2

\[
N^{1/2}\left[ \begin{pmatrix} \operatorname{Re}\hat\tau_m^*(s,0) - \operatorname{Re}\tau_m(s,0) \\ \operatorname{Im}\hat\tau_m^*(s,0) - \operatorname{Im}\tau_m(s,0) \end{pmatrix}, \ldots, \begin{pmatrix} \operatorname{Re}\hat\tau_m^*(s,h) - \operatorname{Re}\tau_m(s,h) \\ \operatorname{Im}\hat\tau_m^*(s,h) - \operatorname{Im}\tau_m(s,h) \end{pmatrix} \right] \Rightarrow Y_m
\]
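As a numerical illustration of (38), the truncated process $X_{tm}$ and the sample codifference can be sketched in a few lines. The innovation law (standard Cauchy, i.e. $S\alpha S$ with $\alpha = 1$), the coefficients $c_j$ and the function names below are our illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
N, m = 100_000, 4
eps = rng.standard_cauchy(N + m)       # illustrative SaS innovations (alpha = 1)
c = 0.5 ** np.arange(m + 1)            # illustrative coefficients c_0, ..., c_m
x = np.convolve(eps, c, mode="valid")  # X_{tm} = sum_{j=0}^m c_j eps_{t-j}, length N

def phi_star(u, v, x, p):
    """phi*_m(u, v; p) = N^{-1} sum_t exp(i(u X_{(t+p)m} + v X_{tm}))."""
    return np.mean(np.exp(1j * (u * x[p:] + v * x[:len(x) - p])))

def tau_star(s, x, p):
    """Sample codifference (38), with principal-value logarithms."""
    return (-np.log(phi_star(s, -s, x, p))
            + np.log(phi_star(s, 0.0, x, p))
            + np.log(phi_star(0.0, -s, x, p)))

t0 = tau_star(0.5, x, 0)  # lag 0: population value -2|s| * sum_j |c_j| here
t8 = tau_star(0.5, x, 8)  # lag > m: X_{(t+8)m} and X_{tm} are independent
```

For Cauchy innovations the lag-0 value equals $\ln\phi(s,0) + \ln\phi(0,-s) = -2|s|\sum_j |c_j|$, while for lags $p > m$ the truncated variables are independent and the codifference is zero, so the estimate at lag 8 should be near zero.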
