PDF of Lecture Notes - School of Mathematical Sciences

2. STATISTICAL INFERENCE

=⇒ MSE_T(θ) = Var(T) = θ(1 − θ)/n.

Remark

This example shows that MSE_T(θ) must be thought of as a function of θ rather than just a number. See Figure 18.

Figure 18: MSE_T(θ) as a function of θ

Intuitively, a good estimator is one for which the MSE is as small as possible. However, quantifying this idea is complicated, because the MSE is a function of θ, not just a number. See Figure 19.

For this reason, it turns out that it is not possible to construct a minimum MSE estimator in general.

To see why, suppose T* is a minimum MSE estimator for θ. Now consider the estimator T_a = a, where a ∈ R is arbitrary. Then for a = θ, T_θ = θ with MSE_{T_θ}(θ) = 0.

Observe MSE_{T_a}(a) = 0; hence if T* exists, then we must have MSE_{T*}(a) = 0. As a is arbitrary, we must have MSE_{T*}(θ) = 0 ∀θ ∈ Θ

=⇒ T* = θ with probability 1.

Therefore we conclude that (excluding trivial situations) no minimum MSE estimator can exist.
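The argument can be seen numerically. The sketch below (with illustrative names, not from the notes) compares the sample mean with the constant estimator T_a for a = 0.5 on Bernoulli(θ) data: the constant estimator has zero MSE at θ = 0.5 but is far worse elsewhere, so neither estimator dominates the other uniformly in θ.

```python
import random

def mse(estimator, theta, n, trials=20000, seed=1):
    """Monte Carlo MSE of an estimator at theta, for a Bernoulli(theta) sample of size n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sample = [1 if rng.random() < theta else 0 for _ in range(n)]
        total += (estimator(sample) - theta) ** 2
    return total / trials

sample_mean = lambda xs: sum(xs) / len(xs)
constant_half = lambda xs: 0.5   # T_a with a = 0.5: ignores the data entirely

n = 25
for theta in (0.3, 0.5, 0.8):
    print(f"theta={theta}: mean={mse(sample_mean, theta, n):.4f}, "
          f"const={mse(constant_half, theta, n):.4f}")
```

The constant estimator's MSE is exactly (0.5 − θ)², so it wins at θ = 0.5 and loses badly away from it; this is the uniform-minimization obstruction in miniature.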

Definition 2.1.5. The bias of the estimator T is defined by:

b_T(θ) = E(T) − θ.
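To illustrate the definition, here is a minimal Monte Carlo sketch (the estimators chosen are assumptions for the example, not taken from the notes): the sample mean of Bernoulli(θ) data is unbiased, while the shrinkage estimator sum(X)/(n+1) has bias −θ/(n+1).

```python
import random

def bias(estimator, theta, n, trials=50000, seed=2):
    """Monte Carlo estimate of b_T(theta) = E(T) - theta."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sample = [1 if rng.random() < theta else 0 for _ in range(n)]
        total += estimator(sample)
    return total / trials - theta

sample_mean = lambda xs: sum(xs) / len(xs)    # unbiased: b_T(theta) = 0
shrunk = lambda xs: sum(xs) / (len(xs) + 1)   # biased: b_T(theta) = -theta/(n+1)

theta, n = 0.6, 20
print(f"sample mean bias ≈ {bias(sample_mean, theta, n):+.4f}")
print(f"shrunk bias      ≈ {bias(shrunk, theta, n):+.4f}  (exact: {-theta/(n+1):+.4f})")
```

For the shrinkage estimator, E(T) = nθ/(n+1), so b_T(θ) = nθ/(n+1) − θ = −θ/(n+1), which the simulation reproduces.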

