
5 Unbiased Point Estimation

The Bhattacharyya Lower Bound

We now consider a simple case in which θ is a scalar (and hence the estimand g(θ) and the estimator T(X) are scalars).

For the PDF f(x; θ) and the Borel scalar function g(θ), assume that each is differentiable r times, and write

\[ f^{(r)} = \frac{\partial^r f(x;\theta)}{\partial\theta^r} \]

and

\[ g^{(r)} = \frac{\partial^r g(\theta)}{\partial\theta^r}. \]

Let T be an unbiased estimator of g(θ). Now form the function

\[ D_s = T - g(\theta) - \sum_{r=1}^{s} a_r f^{(r)}/f, \tag{5.19} \]

where the a_r are constants to be determined. As before, we have

\[ \mathrm{E}\big(f^{(r)}/f\big) = 0, \tag{5.20} \]

and since T is an unbiased estimator of g(θ), we have E(D_s) = 0.

The variance of D_s is therefore

\[ \mathrm{E}(D_s^2) = \int \bigg(T - g(\theta) - \sum_{r=1}^{s} a_r f^{(r)}/f\bigg)^{\!2} f \,\mathrm{d}x. \tag{5.21} \]
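Equation (5.20) can be checked symbolically for a concrete family. The sketch below is my own illustration, not from the text: it assumes the exponential density f(x; θ) = θe^{−θx}, and uses the fact that E(f^{(r)}/f) = ∫ f^{(r)} dx, which should vanish for each r because differentiation can be passed under ∫ f dx = 1.

```python
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)

# Exponential density (an assumed example): f(x; theta) = theta * exp(-theta * x), x > 0
f = theta * sp.exp(-theta * x)

# E(f^(r)/f) = integral of f^(r) dx, which equals the r-th derivative of
# integral f dx = 1, hence 0 for each r.
for r in (1, 2, 3):
    f_r = sp.diff(f, theta, r)
    val = sp.integrate(f_r, (x, 0, sp.oo))
    print(r, sp.simplify(val))  # each prints 0
```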

We now seek to minimize this quantity in the a_r. To do so, for p = 1, …, s, we differentiate with respect to a_p and set the derivative equal to zero:

\[ \int \bigg(T - g(\theta) - \sum_{r=1}^{s} a_r f^{(r)}/f\bigg)\big(f^{(p)}/f\big)\, f \,\mathrm{d}x = 0, \tag{5.22} \]

which yields

\[ \int \big(T - g(\theta)\big) f^{(p)} \,\mathrm{d}x = \sum_{r=1}^{s} a_r \int \frac{f^{(r)}}{f}\,\frac{f^{(p)}}{f}\, f \,\mathrm{d}x. \tag{5.23} \]
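The normal equations (5.23) can be solved symbolically in a concrete case. The sketch below is an illustration I am adding; the exponential density, T(x) = x, and g(θ) = 1/θ are assumptions, not from the text. It forms the matrix of entries E((f^{(r)}/f)(f^{(p)}/f)), takes g^{(p)}(θ) on the left-hand side (as justified by (5.24)), and solves for the a_r with s = 2.

```python
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)

# Assumed example: exponential density; T(x) = x is unbiased for g(theta) = 1/theta.
f = theta * sp.exp(-theta * x)
g = 1 / theta
s = 2

def Ef(expr):
    # Expectation under f on (0, oo)
    return sp.integrate(expr * f, (x, 0, sp.oo))

# Matrix of E((f^(r)/f)(f^(p)/f)) entries, r, p = 1, ..., s
B = sp.Matrix(s, s, lambda p, r: sp.simplify(
    Ef((sp.diff(f, theta, p + 1) / f) * (sp.diff(f, theta, r + 1) / f))))

# Left-hand side of (5.23), which by (5.24) is g^(p)(theta)
gamma = sp.Matrix([sp.diff(g, theta, p + 1) for p in range(s)])

# Solve the normal equations (5.23) for the a_r
a = sp.simplify(B.solve(gamma))
print(a)  # a_1 = -1, a_2 = 0
```

Here a_1 = −1 and a_2 = 0 make D_s vanish identically, consistent with X attaining the Cramér–Rao bound for 1/θ in this family.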

Because of (5.20), the left-hand side of (5.23) reduces to ∫ T f^{(p)} dx, and differentiating the unbiasedness condition ∫ T f dx = g(θ) under the integral sign gives

\[ \int T f^{(p)} \,\mathrm{d}x = g^{(p)}(\theta). \tag{5.24} \]
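Relation (5.24) can also be verified directly for a concrete case. In the sketch below (my own illustration; the exponential density and T(x) = x are assumptions, not from the text), the integral ∫ T f^{(p)} dx is checked against g^{(p)}(θ) for p = 1, 2.

```python
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)

# Assumed example: exponential density; T(x) = x is unbiased for g(theta) = 1/theta.
f = theta * sp.exp(-theta * x)
T = x
g = 1 / theta

# Check (5.24): integral of T * f^(p) dx equals g^(p)(theta)
for p in (1, 2):
    lhs = sp.integrate(T * sp.diff(f, theta, p), (x, 0, sp.oo))
    assert sp.simplify(lhs - sp.diff(g, theta, p)) == 0
```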

Theory of Statistics © 2000–2013 James E. Gentle
