
2. STATISTICAL INFERENCE

Remark

If $x_1, x_2, \ldots, x_n$ are independent, the log-likelihood function can be written as:
\[
l(\theta; x) = \sum_{i=1}^{n} \log f_i(x_i; \theta).
\]

If $x_1, x_2, \ldots, x_n$ are i.i.d., we have:
\[
l(\theta; x) = \sum_{i=1}^{n} \log f(x_i; \theta).
\]
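For concreteness, here is a small worked example; the Exponential model is chosen purely for illustration and is not part of the surrounding text. Suppose $x_1, x_2, \ldots, x_n$ are i.i.d. Exponential with rate $\theta$, so $f(x; \theta) = \theta e^{-\theta x}$ for $x > 0$. Then
\[
l(\theta; x) = \sum_{i=1}^{n} \log\left(\theta e^{-\theta x_i}\right) = n \log \theta - \theta \sum_{i=1}^{n} x_i.
\]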

Definition. 2.2.2<br />

Consider a statistical problem with log-likelihood $l(\theta; x)$. The score is defined by
\[
U(\theta; x) = \frac{\partial l}{\partial \theta},
\]
and the Fisher information is
\[
I(\theta) = E\left( -\frac{\partial^2 l}{\partial \theta^2} \right).
\]
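Continuing the illustrative Exponential($\theta$) example above (again, an assumed model rather than one taken from the notes), the log-likelihood $l(\theta; x) = n \log \theta - \theta \sum_{i=1}^{n} x_i$ gives
\[
U(\theta; x) = \frac{\partial l}{\partial \theta} = \frac{n}{\theta} - \sum_{i=1}^{n} x_i,
\qquad
-\frac{\partial^2 l}{\partial \theta^2} = \frac{n}{\theta^2},
\qquad
I(\theta) = E\left(\frac{n}{\theta^2}\right) = \frac{n}{\theta^2}.
\]
Here the second derivative does not depend on the data, so taking the expectation is immediate.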

Remark

For a single observation with PDF $f(x, \theta)$, the information is
\[
i(\theta) = E\left( -\frac{\partial^2}{\partial \theta^2} \log f(X, \theta) \right).
\]
In the case of $x_1, x_2, \ldots, x_n$ i.i.d., we have
\[
I(\theta) = n\, i(\theta).
\]

Theorem. 2.2.1

Under suitable regularity conditions,
\[
E\{U(\theta; X)\} = 0 \quad \text{and} \quad \mathrm{Var}\{U(\theta; X)\} = I(\theta).
\]
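These identities can be checked directly in the illustrative Exponential($\theta$) example used earlier (an assumed model, for illustration only): since $E(X_i) = 1/\theta$ and $\mathrm{Var}(X_i) = 1/\theta^2$,
\[
E\{U(\theta; X)\} = \frac{n}{\theta} - \sum_{i=1}^{n} E(X_i) = \frac{n}{\theta} - \frac{n}{\theta} = 0,
\qquad
\mathrm{Var}\{U(\theta; X)\} = \sum_{i=1}^{n} \mathrm{Var}(X_i) = \frac{n}{\theta^2} = I(\theta).
\]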

