
5 Unbiased Point Estimation

UMVU estimation is closely related to complete sufficiency, which means that UMVU estimators probably have nice properties (such as being easy to identify) in exponential families. One of the most useful facts is the Lehmann-Scheffé theorem.

Theorem 5.1 (Lehmann-Scheffé Theorem)
Let T be a complete sufficient statistic for θ, and suppose T has finite second moment. If g(θ) is U-estimable, then there is a unique UMVUE of g(θ) of the form h(T), where h is a Borel function.

The first part of this is just a corollary to the Rao-Blackwell theorem, Theorem 3.8. The uniqueness comes from the completeness, and of course "unique" means unique a.e.
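To see the Rao-Blackwell step of that argument concretely, here is a minimal simulation sketch (Python; not from the text, and the Bernoulli setting anticipates Example 5.5 below): starting from the crude unbiased estimator X_1, conditioning on the sufficient statistic T = \sum_{i=1}^n X_i gives E(X_1 | T) = T/n, which keeps the expectation but shrinks the variance.

```python
import numpy as np

rng = np.random.default_rng(0)
n, pi, reps = 10, 0.3, 200_000

# reps independent Bernoulli(pi) samples of size n
x = rng.binomial(1, pi, size=(reps, n))

crude = x[:, 0].astype(float)   # delta = X_1, unbiased for pi but noisy
t = x.sum(axis=1)               # sufficient statistic T = sum of the X_i
rb = t / n                      # E(X_1 | T) = T/n, the Rao-Blackwellized estimator

print(crude.mean(), rb.mean())  # both approx pi = 0.3 (unbiasedness preserved)
print(crude.var(), rb.var())    # approx pi(1-pi) = 0.21 vs pi(1-pi)/n = 0.021
```

The conditioning leaves the expectation unchanged but cuts the variance by the factor n; completeness of T is what then makes the improved estimator unique.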

The Lehmann-Scheffé theorem may immediately identify a UMVUE.

Example 5.5 UMVUE of Bernoulli parameter

Consider the Bernoulli family of distributions with parameter π. Suppose we take a random sample X_1, ..., X_n. Now the Bernoulli (or in this case, the binomial(n, π)) is a complete one-parameter exponential family, and T = \sum_{i=1}^n X_i is a complete sufficient statistic for π with expectation nπ. By the Lehmann-Scheffé theorem, therefore, the unique UMVUE of π is

    W = \sum_{i=1}^n X_i / n.    (5.5)

In Example 3.17, page 265, we showed that the variance of W achieves the CRLB; hence it must be UMVUE.
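To fill in the computation behind that claim (a standard calculation, stated here for convenience rather than quoted from Example 3.17): the Fisher information about π in the sample is I(π) = n/(π(1 − π)), and

    Var(W) = π(1 − π)/n = 1/I(π),

so the variance of W equals the CRLB.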

The random sample from a Bernoulli distribution is the same as a single binomial observation, and W is an unbiased estimator of π, as in Example 5.1. We also saw in that example that a constrained random sample from a Bernoulli distribution is the same as a single negative binomial observation N, and an unbiased estimator of π in that case is (t − 1)/(N − 1), where t is the required number of 1's in the constrained random sample. This estimator is also UMVU for π (Exercise 5.1).
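As a quick empirical check of that estimator (a Python sketch, not from the text; note that NumPy's negative binomial counts the 0's before the t-th 1, so the number of trials is N = failures + t):

```python
import numpy as np

rng = np.random.default_rng(1)
t, pi, reps = 5, 0.3, 500_000

failures = rng.negative_binomial(t, pi, size=reps)  # 0's before the t-th 1
N = failures + t                                    # total trials needed for t 1's
est = (t - 1) / (N - 1)                             # the unbiased estimator of pi

print(est.mean())   # approx pi = 0.3
```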

In the usual definition of this family, π ∈ Π = ]0, 1[. Notice that if \sum_{i=1}^n X_i = 0 or if \sum_{i=1}^n X_i = n, then W ∉ Π. Hence, the UMVUE may not be valid in the sense of being a legitimate parameter for the probability distribution.

Useful ways for checking that an estimator is UMVU are based on the following theorem and corollary.

Theorem 5.2
Let P = {P_θ}. Let T be unbiased for g(θ) and have finite second moment. Then T is a UMVUE for g(θ) iff E(TU) = 0 ∀θ ∈ Θ, for every statistic U with E(U) = 0 and E(U²) < ∞ ∀θ ∈ Θ.
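Before the proof, a concrete instance of the condition (a sketch in the Bernoulli setting of Example 5.5, not from the text): U = X_1 − X_2 satisfies E(U) = 0 and E(U²) < ∞, and E(WU) = Cov(W, U) = (Var(X_1) − Var(X_2))/n = 0, as the theorem requires of the UMVUE W.

```python
import numpy as np

rng = np.random.default_rng(2)
n, pi, reps = 10, 0.7, 500_000

x = rng.binomial(1, pi, size=(reps, n))
w = x.mean(axis=1)       # the UMVUE W of Example 5.5
u = x[:, 0] - x[:, 1]    # an unbiased estimator of zero: E(U) = 0
print((w * u).mean())    # approx 0, the orthogonality E(WU) = 0
```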

Proof.
First consider "only if".

