
5 Unbiased Point Estimation

You are to show this in Exercise 5.10. Compare this with the MLE of $\sigma^2$ in Example 6.27 in Chapter 6.

The model in equation (5.88) is called the one-way AOV model. If the $\alpha_i$ in this model are assumed to be constants, it is called a "fixed-effects model". A fixed-effects model is also sometimes called "model I". Now let's consider a variant of this called a "random-effects model" or "model II", because the $\alpha_i$ in this model are assumed to be iid random variables.

Example 5.31 UMVUEs of the variances in the one-way random-effects AOV model

Consider the linear model
$$ Y_{ij} = \mu + \delta_i + \epsilon_{ij}, \quad i = 1, \ldots, m; \; j = 1, \ldots, n, \tag{5.97} $$
where the $\delta_i$ are identically distributed with $\mathrm{E}(\delta_i) = 0$, $\mathrm{V}(\delta_i) = \sigma^2_\delta$, and $\mathrm{Cov}(\delta_i, \delta_{\tilde{\imath}}) = 0$ for $i \neq \tilde{\imath}$, and the $\epsilon_{ij}$ are independent of the $\delta_i$ and are identically distributed with $\mathrm{E}(\epsilon_{ij}) = 0$, $\mathrm{V}(\epsilon_{ij}) = \sigma^2_\epsilon$, and $\mathrm{Cov}(\epsilon_{ij}, \epsilon_{\tilde{\imath}\tilde{\jmath}}) = 0$ for either $i \neq \tilde{\imath}$ or $j \neq \tilde{\jmath}$.

An important difference between the random-effects model and the fixed-effects model is that in the random-effects model, we do not have independence of the observables. We have

$$ \mathrm{Cov}(Y_{ij}, Y_{\tilde{\imath}\tilde{\jmath}}) = \begin{cases} \sigma^2_\delta + \sigma^2_\epsilon & \text{for } i = \tilde{\imath},\ j = \tilde{\jmath}, \\ \sigma^2_\delta & \text{for } i = \tilde{\imath},\ j \neq \tilde{\jmath}, \\ 0 & \text{for } i \neq \tilde{\imath}. \end{cases} \tag{5.98} $$
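The middle case follows because for $i = \tilde{\imath}$ and $j \neq \tilde{\jmath}$, $\mathrm{Cov}(Y_{ij}, Y_{i\tilde{\jmath}}) = \mathrm{Cov}(\delta_i + \epsilon_{ij},\, \delta_i + \epsilon_{i\tilde{\jmath}}) = \mathrm{V}(\delta_i) = \sigma^2_\delta$. As a sanity check on (5.98), the following Monte Carlo sketch simulates model (5.97) and compares empirical covariances with the three cases; the values $\mu = 5$, $\sigma^2_\delta = 2$, $\sigma^2_\epsilon = 1$, the normal distributions, the seed, and all variable names are illustrative assumptions, not from the text.

```python
# Monte Carlo check of the covariance structure in equation (5.98).
# All parameter values here are illustrative choices, not from the text.
import numpy as np

rng = np.random.default_rng(0)
R, m, n = 200_000, 3, 4                 # R replicates of the whole m x n layout
mu, sigma2_d, sigma2_e = 5.0, 2.0, 1.0  # sigma^2_delta = 2, sigma^2_epsilon = 1

delta = rng.normal(0.0, np.sqrt(sigma2_d), size=(R, m, 1))  # delta_i, shared across j
eps = rng.normal(0.0, np.sqrt(sigma2_e), size=(R, m, n))    # epsilon_ij
Y = mu + delta + eps                                        # Y_ij = mu + delta_i + eps_ij

# Empirical versions of the three cases in (5.98):
var_same = Y[:, 0, 0].var(ddof=1)                    # i = i~, j = j~: sigma2_d + sigma2_e
cov_within = np.cov(Y[:, 0, 0], Y[:, 0, 1])[0, 1]    # i = i~, j != j~: sigma2_d
cov_between = np.cov(Y[:, 0, 0], Y[:, 1, 3])[0, 1]   # i != i~: 0

print(round(var_same, 2), round(cov_within, 2), round(cov_between, 2))
```

With these choices the three empirical quantities land near $3$, $2$, and $0$, matching the three cases of (5.98).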

A model such as this may be appropriate when there are a large number of possible treatments and $m$ of them are chosen randomly and applied to experimental units whose responses $Y_{ij}$ are observed. While in the fixed-effects model (5.88), we are interested in whether $\alpha_1 = \cdots = \alpha_m = 0$, in the random-effects model, we are interested in whether $\sigma^2_\delta = 0$, which would result in a similar practical decision about the treatments.

In the model (5.97) the variance of each $Y_{ij}$ is $\sigma^2_\delta + \sigma^2_\epsilon$, and our interest in using the model is to make inference on the relative sizes of the components of the variance, $\sigma^2_\delta$ and $\sigma^2_\epsilon$. The model is sometimes called a "variance components model".

Let us suppose now that $\delta_i \overset{\text{iid}}{\sim} \mathrm{N}(0, \sigma^2_\delta)$, where $\sigma^2_\delta \geq 0$, and $\epsilon_{ij} \overset{\text{iid}}{\sim} \mathrm{N}(0, \sigma^2_\epsilon)$, where as usual $\sigma^2_\epsilon > 0$. This will allow us to determine exact sampling distributions of the relevant statistics.

We transform the model using Helmert matrices $H_m$ and $H_n$ as in equation (5.82).
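Equation (5.82) is not reproduced in this excerpt, so the sketch below uses one common convention for an $n \times n$ Helmert matrix: the first row is proportional to a vector of ones, and row $k$ is a contrast among the first $k + 1$ coordinates. The property the transformation relies on is orthogonality, $H_n H_n^{\mathrm{T}} = I_n$. The function name `helmert` and the indexing details are illustrative assumptions.

```python
# Sketch of an n x n Helmert matrix under one common convention
# (the exact form in equation (5.82) may differ in signs or ordering).
import math

def helmert(n):
    H = [[0.0] * n for _ in range(n)]
    H[0] = [1.0 / math.sqrt(n)] * n        # first row: constant, length-1 vector
    for k in range(1, n):                  # row k: contrast of first k+1 coordinates
        c = 1.0 / math.sqrt(k * (k + 1))
        for j in range(k):
            H[k][j] = c
        H[k][k] = -k * c                   # balances the row so it sums to zero
    return H

# Orthogonality check: H H^T should be the identity matrix.
H = helmert(5)
HHt = [[sum(H[a][j] * H[b][j] for j in range(5)) for b in range(5)]
       for a in range(5)]
err = max(abs(HHt[a][b] - (1.0 if a == b else 0.0))
          for a in range(5) for b in range(5))
print(err < 1e-12)
```

Each non-constant row sums to zero and has unit length, so premultiplying a data vector by $H_n$ separates the (scaled) mean from $n - 1$ orthogonal contrasts.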

Let
$$ Y = \begin{bmatrix} Y_{11} & \cdots & Y_{1n} \\ \vdots & & \vdots \\ Y_{m1} & \cdots & Y_{mn} \end{bmatrix}; \quad \delta = \begin{bmatrix} \delta_1 \\ \vdots \\ \delta_m \end{bmatrix}; \quad \text{and } \epsilon = \begin{bmatrix} \epsilon_{11} & \cdots & \epsilon_{1n} \\ \vdots & & \vdots \\ \epsilon_{m1} & \cdots & \epsilon_{mn} \end{bmatrix}. $$

Theory of Statistics ©2000–2013 James E. Gentle
