
THÈSE Estimation, validation et identification des modèles ARMA ...



Chapitre 4. Multivariate portmanteau test for weak structural VARMA models

The structural representation $A_{\theta_0}(L)X_t = B_{\theta_0}(L)\epsilon_t$ can be rewritten as the reduced VARMA representation
\[
X_t - \sum_{i=1}^{p} A_{00}^{-1}A_{0i}X_{t-i} = e_t - \sum_{i=1}^{q} A_{00}^{-1}B_{0i}B_{00}^{-1}A_{00}\,e_{t-i}. \tag{4.3}
\]

Note that $e_t(\theta_0) = e_t$. For simplicity, we will omit the notation $\theta$ in all quantities taken at the true value $\theta_0$. For all $\theta \in \Theta$, the assumption on the MA polynomial (from A3) implies that there exists a sequence of constant matrices $(C_i(\theta))$ such that $\sum_{i=1}^{\infty}\|C_i(\theta)\| < \infty$ and
\[
e_t(\theta) = X_t - \sum_{i=1}^{\infty} C_i(\theta)X_{t-i}. \tag{4.4}
\]
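Under assumption A3 the weights $C_i(\theta)$ in (4.4) can be obtained recursively by matching powers of $z$ in the identity linking the AR and MA polynomials. A minimal numerical sketch for a bivariate ARMA(1,1), with illustrative coefficient matrices (the names `Phi1`, `Psi1` and the numbers are hypothetical, not taken from the thesis), showing the geometric decay behind $\sum_i \|C_i(\theta)\| < \infty$:

```python
import numpy as np

# Hypothetical reduced-form coefficients (illustrative, not from the thesis):
# X_t - Phi1 X_{t-1} = e_t - Psi1 e_{t-1}
Phi1 = np.array([[0.5, 0.1],
                 [0.0, 0.4]])
Psi1 = np.array([[0.3, 0.0],
                 [0.1, 0.2]])

def ar_inf_weights(Phi1, Psi1, m):
    """First m weights C_i(theta) in e_t = X_t - sum_i C_i X_{t-i}.

    Matching powers of z in Psi(z) Pi(z) = Phi(z) with Pi_0 = I gives,
    for p = q = 1:  C_1 = Phi1 - Psi1,  C_i = Psi1 @ C_{i-1}  (i >= 2).
    """
    C = [Phi1 - Psi1]
    for _ in range(m - 1):
        C.append(Psi1 @ C[-1])
    return C

C = ar_inf_weights(Phi1, Psi1, 50)
norms = [np.linalg.norm(Ci) for Ci in C]
print(norms[0], norms[49])   # geometric decay
print(sum(norms))            # finite, consistent with A3
```

With $p = q = 1$ the norms decay at the rate of the spectral radius of $\Psi_1$, so the summability condition holds whenever the MA polynomial is invertible.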

Given a realization $X_1, X_2, \ldots, X_n$, the variable $e_t(\theta)$ can be approximated, for $0 < t \le n$, by $\tilde{e}_t(\theta)$ defined recursively by
\[
\tilde{e}_t(\theta) = X_t - \sum_{i=1}^{p} A_0^{-1}A_iX_{t-i} + \sum_{i=1}^{q} A_0^{-1}B_iB_0^{-1}A_0\,\tilde{e}_{t-i}(\theta),
\]

where the unknown initial values are set to zero: $\tilde{e}_0(\theta) = \cdots = \tilde{e}_{1-q}(\theta) = X_0 = \cdots = X_{1-p} = 0$. The Gaussian quasi-likelihood is given by
\[
\mathrm{L}_n(\theta, \Sigma_e) = \prod_{t=1}^{n} \frac{1}{(2\pi)^{d/2}\sqrt{\det \Sigma_e}} \exp\left\{-\frac{1}{2}\tilde{e}_t'(\theta)\Sigma_e^{-1}\tilde{e}_t(\theta)\right\}, \qquad \Sigma_e = A_0^{-1}B_0\Sigma B_0'A_0^{-1\prime}.
\]
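The truncated recursion and the quasi-likelihood above translate directly into code. A sketch for a bivariate reduced VARMA(1,1), where `Phi1` stands in for $A_0^{-1}A_1$ and `Psi1` for $A_0^{-1}B_1B_0^{-1}A_0$ (names and numbers are illustrative, not from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 2, 500
Phi1 = np.array([[0.5, 0.1], [0.0, 0.4]])   # stands in for A_0^{-1} A_1
Psi1 = np.array([[0.3, 0.0], [0.1, 0.2]])   # stands in for A_0^{-1} B_1 B_0^{-1} A_0
Sigma_e = np.eye(d)

# Simulate a reduced VARMA(1,1): X_t = Phi1 X_{t-1} + e_t - Psi1 e_{t-1}
e = rng.standard_normal((n, d))
X = np.zeros((n, d))
for t in range(1, n):
    X[t] = Phi1 @ X[t-1] + e[t] - Psi1 @ e[t-1]

def residuals(X, Phi1, Psi1):
    """tilde-e_t(theta), with the unknown initial values set to zero."""
    n, d = X.shape
    et = np.zeros((n, d))
    for t in range(n):
        x_lag = X[t-1] if t >= 1 else np.zeros(d)
        e_lag = et[t-1] if t >= 1 else np.zeros(d)
        et[t] = X[t] - Phi1 @ x_lag + Psi1 @ e_lag
    return et

def log_qlik(et, Sigma_e):
    """Logarithm of the Gaussian quasi-likelihood L_n(theta, Sigma_e)."""
    n, d = et.shape
    quad = np.einsum('ti,ij,tj->t', et, np.linalg.inv(Sigma_e), et)
    return (-0.5 * n * d * np.log(2 * np.pi)
            - 0.5 * n * np.log(np.linalg.det(Sigma_e))
            - 0.5 * quad.sum())

et = residuals(X, Phi1, Psi1)
print(log_qlik(et, Sigma_e))
```

At the true parameter value, the effect of the zero initial values dies out geometrically, so $\tilde{e}_t(\theta_0)$ quickly becomes indistinguishable from the true $e_t$.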

A quasi-maximum likelihood (QML) estimator of $\theta$ and $\Sigma_e$ is a measurable solution $(\hat{\theta}_n, \hat{\Sigma}_e)$ of
\[
(\hat{\theta}_n, \hat{\Sigma}_e) = \arg\min_{\theta,\Sigma_e}\left\{\log(\det \Sigma_e) + \frac{1}{n}\sum_{t=1}^{n}\tilde{e}_t'(\theta)\Sigma_e^{-1}\tilde{e}_t(\theta)\right\}.
\]
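For fixed $\theta$ the minimization over $\Sigma_e$ has a closed form: writing $S_n(\theta) = n^{-1}\sum_t \tilde{e}_t(\theta)\tilde{e}_t'(\theta)$, the criterion equals $\log\det\Sigma_e + \operatorname{tr}(\Sigma_e^{-1}S_n)$, which is minimized at $\Sigma_e = S_n(\theta)$. A quick numerical check of this standard fact (the data here are arbitrary, not from the thesis):

```python
import numpy as np

rng = np.random.default_rng(1)
d, n = 3, 400
# Arbitrary correlated "residuals" standing in for tilde-e_t(theta)
et = rng.standard_normal((n, d)) @ np.array([[1.0, 0.2, 0.0],
                                             [0.0, 1.0, 0.3],
                                             [0.0, 0.0, 1.0]])
S = et.T @ et / n   # S_n(theta) = (1/n) sum_t e_t e_t'

def criterion(Sigma):
    # log det(Sigma_e) + (1/n) sum_t e_t' Sigma_e^{-1} e_t
    #                  = log det(Sigma_e) + tr(Sigma_e^{-1} S)
    return np.log(np.linalg.det(Sigma)) + np.trace(np.linalg.inv(Sigma) @ S)

best = criterion(S)
# Any other positive definite Sigma_e gives a strictly larger value.
for _ in range(20):
    P = 0.1 * rng.standard_normal((d, d))
    assert criterion(S + P @ P.T + 0.05 * np.eye(d)) > best
print(best)
```

This concentration step is what links the QML criterion to the determinant criterion of the least squares estimator discussed next.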

Under the following additional assumptions, Boubacar Mainassara and Francq (2009) established, in Theorem 1 and Theorem 2 respectively, the consistency and the asymptotic normality of the QML estimator of weak multivariate ARMA models.

Assume that $\theta_0$ is not on the boundary of the parameter space $\Theta$.

A6 : We have $\theta_0 \in \overset{\circ}{\Theta}$, where $\overset{\circ}{\Theta}$ denotes the interior of $\Theta$.

We denote by $\alpha_\epsilon(k)$, $k = 0, 1, \ldots$, the strong mixing coefficients of the stationary process $\epsilon = (\epsilon_t)$, defined by
\[
\alpha_\epsilon(k) = \sup_{A \in \sigma(\epsilon_u,\, u \le t),\ B \in \sigma(\epsilon_u,\, u \ge t+k)} \left|P(A \cap B) - P(A)P(B)\right|.
\]

The reader is referred to Davidson (1994) for details about mixing assumptions.

A7 : We have $\mathrm{E}\|\epsilon_t\|^{4+2\nu} < \infty$ and $\sum_{k=0}^{\infty}\{\alpha_\epsilon(k)\}^{\frac{\nu}{2+\nu}} < \infty$ for some $\nu > 0$.

One of the most popular estimation procedures is the least squares estimator (LSE), which minimizes
\[
\log \det \hat{\Sigma}_e = \log \det\left\{\frac{1}{n}\sum_{t=1}^{n}\tilde{e}_t(\hat{\theta})\tilde{e}_t'(\hat{\theta})\right\},
\]
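In practice this concentrated criterion is minimized numerically over $\theta$. A toy sketch for the pure VAR(1) special case ($q = 0$), with an illustrative coefficient matrix, simulated data, and `scipy.optimize.minimize` — a sketch under these assumptions, not the thesis's implementation:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(42)
d, n = 2, 2000
Phi_true = np.array([[0.5, 0.1], [0.0, 0.3]])  # illustrative VAR(1) coefficient

# Simulate X_t = Phi X_{t-1} + eps_t (the q = 0 special case)
X = np.zeros((n, d))
for t in range(1, n):
    X[t] = Phi_true @ X[t-1] + rng.standard_normal(d)

def lse_criterion(phi_flat):
    Phi = phi_flat.reshape(d, d)
    et = X[1:] - X[:-1] @ Phi.T          # tilde-e_t(theta) when q = 0
    S = et.T @ et / len(et)              # (1/n) sum_t e_t e_t'
    return np.log(np.linalg.det(S))      # log det Sigma_e-hat

res = minimize(lse_criterion, np.zeros(d * d), method="Nelder-Mead",
               options={"xatol": 1e-9, "fatol": 1e-12,
                        "maxiter": 20000, "maxfev": 20000})
Phi_hat = res.x.reshape(d, d)
print(Phi_hat)
```

For a VAR with the same regressors in every equation, the minimizer of this determinant criterion coincides with the equation-by-equation OLS estimator, which gives an easy check on the numerical optimization.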
