
8 Nonparametric and Robust Inference

For a step function with $m$ regions having constant values $c_1, \ldots, c_m$, the likelihood is
\[
L(c_1, \ldots, c_m;\, y_1, \ldots, y_n) = \prod_{i=1}^{n} p(y_i) = \prod_{k=1}^{m} c_k^{n_k}, \tag{8.30}
\]

where $n_k$ is the number of data points in the $k$th region. For the step function to be a bona fide estimator, all $c_k$ must be nonnegative and finite. A maximum therefore exists in the class of step functions that are bona fide estimators.
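
As a concrete illustration of equation (8.30), the following sketch (not from the text; the region edges, heights, and sample are made up) evaluates the likelihood of a step-function density both as a product over observations and as a product over regions; the two forms agree.

```python
# Sketch: the two forms of the likelihood in equation (8.30) agree.
# Region edges, heights, and the sample below are made up for illustration.
import numpy as np

edges = np.array([0.0, 1.0, 2.5, 4.0])   # m = 3 regions on [0, 4]
c = np.array([0.4, 0.3, 0.1])            # constant values c_1, ..., c_m
y = np.array([0.3, 0.7, 1.1, 2.0, 3.2])  # sample y_1, ..., y_n

# Product over observations: prod_i p(y_i)
region = np.searchsorted(edges, y, side="right") - 1
lik_obs = np.prod(c[region])

# Product over regions: prod_k c_k^{n_k}, where n_k counts points in region k
n_k = np.bincount(region, minlength=len(c))
lik_regions = np.prod(c ** n_k)

assert np.isclose(lik_obs, lik_regions)
```

Here the heights were chosen so that $\sum_k c_k v_k = 0.4(1) + 0.3(1.5) + 0.1(1.5) = 1$, so the step function in the sketch is a bona fide density.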

If $v_k$ is the measure of the volume of the $k$th region (that is, $v_k$ is the length of an interval in the univariate case, the area in the bivariate case, and so on), we have
\[
\sum_{k=1}^{m} c_k v_k = 1.
\]

We incorporate this constraint together with equation (8.30) to form the Lagrangian,
\[
L(c_1, \ldots, c_m) + \lambda\left(1 - \sum_{k=1}^{m} c_k v_k\right).
\]

Differentiating the Lagrangian function and setting the derivative to zero, we have at the maximum point $c_k = c_k^*$, for any $\lambda$,
\[
\frac{\partial L}{\partial c_k} = \lambda v_k.
\]

From equation (8.30), $\partial L/\partial c_k = (n_k/c_k)L$, so at the maximum we get
\[
n_k L = \lambda c_k^* v_k.
\]

Summing both sides of this equation over $k$, and using $\sum_k n_k = n$ and $\sum_k c_k^* v_k = 1$, we have
\[
n L = \lambda,
\]
and then substituting, we have
\[
n_k L = n L\, c_k^* v_k.
\]

Therefore, the maximum of the likelihood occurs at
\[
c_k^* = \frac{n_k}{n v_k}.
\]
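
As a quick numerical check of this closed form (a sketch, not from the text), a generic equality-constrained optimizer such as scipy's SLSQP, run on made-up counts and region volumes, recovers the same heights when maximizing the log of equation (8.30) subject to $\sum_k c_k v_k = 1$.

```python
# Sketch: numerically verify that c_k = n_k / (n v_k) maximizes the
# likelihood subject to sum_k c_k v_k = 1.  Counts and volumes are made up;
# SLSQP is just one convenient equality-constrained optimizer.
import numpy as np
from scipy.optimize import minimize

n_k = np.array([2.0, 2.0, 1.0])   # counts in the m regions
v_k = np.array([1.0, 1.5, 1.5])   # volumes (lengths) of the regions
n = n_k.sum()

def neg_log_lik(c):
    # -log L = -sum_k n_k log c_k, from equation (8.30)
    return -np.sum(n_k * np.log(c))

res = minimize(
    neg_log_lik,
    x0=np.full(len(n_k), 1.0 / v_k.sum()),             # feasible start
    method="SLSQP",
    bounds=[(1e-10, None)] * len(n_k),                  # nonnegative heights
    constraints=[{"type": "eq", "fun": lambda c: c @ v_k - 1.0}],
)

print(res.x)            # numerical maximizer
print(n_k / (n * v_k))  # closed form n_k / (n v_k)
```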

The restricted maximum likelihood estimator is therefore the step function that takes the value $c_k^* = n_k/(n v_k)$ on the $k$th region.
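
In the univariate case with interval regions this is just the density-scaled histogram. The sketch below (not from the text; the sample and bin edges are made up) computes the heights $n_k/(n v_k)$ directly, checks that the estimate is a bona fide density, and notes that numpy's density-normalized histogram returns the same heights.

```python
# Sketch: the restricted MLE over step functions is the histogram estimator.
# The sample and bin edges are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=200)                   # made-up sample
edges = np.linspace(y.min(), y.max(), 17)  # m = 16 interval regions

n_k, _ = np.histogram(y, bins=edges)       # counts per region
v_k = np.diff(edges)                       # region lengths
c_star = n_k / (y.size * v_k)              # heights n_k / (n v_k)

# Bona fide density: nonnegative heights that integrate to one.
assert np.all(c_star >= 0)
assert np.isclose(np.sum(c_star * v_k), 1.0)

# numpy's density-normalized histogram gives the same heights.
heights, _ = np.histogram(y, bins=edges, density=True)
assert np.allclose(c_star, heights)
```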

