
2. STATISTICAL INFERENCE

(II) Variance inequality:

Var(T) = E{Var(T|S)} + Var{E(T|S)}
       = E{Var(T|S)} + Var(T*)
       ≥ Var(T*),

since E{Var(T|S)} ≥ 0.

Observe also that Var(T) = Var(T*)

⟹ E{Var(T|S)} = 0
⟹ Var(T|S) = 0 with prob. 1
⟹ T = E(T|S) with prob. 1.
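The first equality above is the law of total variance; for completeness, a short derivation from the conditioning identities E(T) = E{E(T|S)} and E(T²) = E{E(T²|S)}:

Var(T) = E(T²) − {E(T)}²
       = E{E(T²|S)} − [E{E(T|S)}]²
       = E{Var(T|S) + [E(T|S)]²} − [E{E(T|S)}]²
       = E{Var(T|S)} + Var{E(T|S)}.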

(III) T* is an estimator:

Since S is sufficient for θ,

T* = ∫_{−∞}^{∞} T(x) f(x|s) dx,

which does not depend on θ.
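Heuristically, the reason f(x|s) is free of θ is the factorization criterion: writing f(x; θ) = g{s(x); θ} h(x), the factor involving θ cancels in the conditional density,

f(x|s) = g(s; θ) h(x) / {g(s; θ) c(s)} = h(x)/c(s),

where c(s) = ∫_{s(y)=s} h(y) dy normalizes over the level set of S (a sketch only; the continuous case needs some care). Hence T* can be computed from the data alone, i.e. it is a genuine statistic.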

Remarks

(1) The Rao-Blackwell Theorem can occasionally be used to construct estimators.

(2) The theoretical importance of this result is the observation that T* always depends on x only through S. If T is already a MVUE, then T* = T, and hence the MVUE depends on x only through S.

Example

Suppose x_1, …, x_n ~ i.i.d. N(µ, σ²), with σ² known. We want to estimate µ. Take T = x_1 as an unbiased estimator for µ. Then

S = Σ_{i=1}^{n} x_i

is a sufficient statistic for µ.
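Carrying the Rao-Blackwell construction through (a routine completion, sketched here):

T* = E(T|S) = E(x_1 | Σ_{i=1}^{n} x_i).

By symmetry, E(x_1|S) = E(x_2|S) = … = E(x_n|S), and these n terms sum to E(S|S) = S, so

T* = S/n = x̄.

As a check on the variance inequality, Var(T) = σ² while Var(T*) = σ²/n ≤ σ².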

