
498 6 Statistical Inference Based on Likelihood

where x is observable, θ is unknown and unobservable, g is a function of known form, but f and h are of unknown form. The estimation problem has two components: the estimation of the parameter θ and the nonparametric estimation of the function h.

In the setup of equation (6.67), when f(x; θ) is the PDF of the observable, we form a partial likelihood function based on g(x; θ). This partial likelihood is a likelihood in the sense that it is a constant multiple (wrt θ) of the full likelihood function. The parameter θ can therefore be estimated by the ordinary method of maximum likelihood.

The most common example of this kind of problem in statistical inference is estimation of the proportional hazards model. Rather than discuss partial likelihood further here, I will postpone consideration of this semiparametric problem to Section 8.4 beginning on page 572.
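As a concrete, hedged illustration (not the book's development, which is deferred to Section 8.4), the partial likelihood for the proportional hazards model with a single covariate can be maximized numerically like any ordinary likelihood. The function name and the synthetic data below are my own, for illustration only:

```python
import numpy as np
from scipy.optimize import minimize

def neg_partial_loglik(beta, times, events, x):
    """Negative Cox partial log-likelihood for one covariate.

    times  : observed times (event or censoring)
    events : 1 if an event was observed, 0 if censored
    x      : covariate values
    """
    eta = beta[0] * x
    nll = 0.0
    for i in np.where(events == 1)[0]:
        # risk set: subjects still under observation at the i-th event time
        risk = times >= times[i]
        nll -= eta[i] - np.log(np.sum(np.exp(eta[risk])))
    return nll

# synthetic data with hazard proportional to exp(0.5 * x), no censoring
rng = np.random.default_rng(0)
x = rng.normal(size=50)
times = rng.exponential(scale=np.exp(-0.5 * x))
events = np.ones(50, dtype=int)

fit = minimize(neg_partial_loglik, x0=[0.0],
               args=(times, events, x), method="BFGS")
beta_hat = fit.x[0]
```

Note that the baseline hazard (the nonparametric component h) never appears in the partial likelihood, which is exactly what makes θ estimable without estimating h.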

Notes and Further Reading

Most of the material in this chapter is covered in MS2 Section 4.4, Section 4.5, and Section 5.4, and in TPE2 Chapter 6.

Likelihood and Probability

Although it is natural to think of the distribution that yields the largest likelihood as the “most probable” distribution that gave rise to an observed sample, it is important not to think of the likelihood function, even if it could be properly normalized, as a probability density. In the likelihood approach to statistical inference, there is no posterior conditional probability distribution as there is in a Bayesian approach. The book by Edwards (1992) provides a good discussion of the fundamental concept of likelihood.

EM Methods

EM methods were first discussed systematically by Dempster et al. (1977). A general reference for EM methods is Ng et al. (2012).
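A minimal sketch of an EM iteration, for a two-component univariate Gaussian mixture with a shared variance (my own simplification, chosen to keep the M-step closed-form): the E-step computes the posterior probability that each observation came from the first component, and the M-step updates the parameters by weighted MLEs.

```python
import numpy as np

def em_gaussian_mixture(y, n_iter=200):
    """EM for a two-component Gaussian mixture with shared variance."""
    # crude initialization from the data range
    pi, mu1, mu2 = 0.5, np.min(y), np.max(y)
    s2 = np.var(y)
    for _ in range(n_iter):
        # E-step: posterior probability of membership in component 1
        d1 = pi * np.exp(-(y - mu1) ** 2 / (2 * s2))
        d2 = (1 - pi) * np.exp(-(y - mu2) ** 2 / (2 * s2))
        g = d1 / (d1 + d2)
        # M-step: weighted maximum likelihood updates
        pi = g.mean()
        mu1 = np.sum(g * y) / np.sum(g)
        mu2 = np.sum((1 - g) * y) / np.sum(1 - g)
        s2 = np.sum(g * (y - mu1) ** 2 + (1 - g) * (y - mu2) ** 2) / len(y)
    return pi, mu1, mu2, s2

# synthetic mixture: 60% N(0,1), 40% N(5,1)
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0, 1, 300), rng.normal(5, 1, 200)])
pi, mu1, mu2, s2 = em_gaussian_mixture(y)
```

Each iteration is guaranteed not to decrease the observed-data likelihood, but, as the note on multiple RLEs below suggests, convergence may be to a local rather than a global maximum.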

Computations

The R function fitdistr in the MASS library computes the MLEs for a number of common distributions.

Multiple RLEs

There are interesting open questions associated with determining if an RLE yields a global maximum. See, for example, Biernacki (2005).

Theory of Statistics ©2000–2013 James E. Gentle
