
Actuarial Modelling of Claim Counts Risk Classification, Credibility ...


Mixed Poisson Models for Claim Numbers 35

1.5 Statistical Inference for Discrete Distributions

1.5.1 Maximum Likelihood Estimators

Maximum likelihood is a method of estimation and inference for parametric models. The maximum likelihood estimator is the value of the parameter (or parameter vector) that makes the observed data most likely to have occurred, given the data generating process assumed to have produced the variable of interest.

The likelihood of a sample of observations is defined as the joint density of the data, with the parameters taken as variable and the data as fixed (multiplied by any arbitrary constant or function of the data but not of the parameters). Specifically, let $N_1, N_2, \ldots, N_n$ be a set of independent and identically distributed outcomes with probability mass function $p(\cdot|\theta)$, where $\theta$ is a vector of parameters. The likelihood function $\mathcal{L}(\theta)$ is the probability of observing the data $N_1 = k_1, \ldots, N_n = k_n$, that is,

$$\mathcal{L}(\theta) = \prod_{i=1}^{n} p(k_i|\theta).$$
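As a concrete illustration (not taken from the text), suppose the $N_i$ are Poisson distributed with mean $\theta$, so that $p(k|\theta) = e^{-\theta}\theta^k/k!$. The likelihood of the sample then factors as

$$\mathcal{L}(\theta) = \prod_{i=1}^{n} e^{-\theta}\,\frac{\theta^{k_i}}{k_i!} = e^{-n\theta}\,\theta^{\sum_{i=1}^{n} k_i}\,\prod_{i=1}^{n}\frac{1}{k_i!},$$

where the last product does not involve $\theta$ and so plays no role in the maximization.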

The key idea for estimation in likelihood problems is that the most reasonable estimate is the value of the parameter vector that would make the observed data most likely to occur. The implicit assumption is of course that the data at hand are reliable. More formally, we seek a value of $\theta$ that maximizes $\mathcal{L}(\theta)$. The maximum likelihood estimator of $\theta$ is the random variable $\widehat{\theta}$ for which the likelihood is maximum, that is,

$$\mathcal{L}(\widehat{\theta}) \geq \mathcal{L}(\theta) \quad \text{for all } \theta.$$

It is usually simpler mathematically to find the maximum of the logarithm of the likelihood,

$$L(\theta) = \ln \mathcal{L}(\theta) = \sum_{i=1}^{n} \ln p(k_i|\theta),$$

rather than the likelihood itself. The function $L$ is usually referred to as the log-likelihood. Because the logarithm is a monotonic transformation, the log-likelihood will be maximized at the same parameter value that maximizes the likelihood (although the shape of the log-likelihood is different from that of the likelihood).
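The monotonicity argument can be checked numerically. The following sketch (a hypothetical example, not from the text) evaluates both the likelihood and the log-likelihood of a small Poisson claim-count sample over a grid of parameter values and confirms that they are maximized at the same point:

```python
import math

# Hypothetical claim-count sample (illustration only, not data from the text).
sample = [0, 1, 0, 2, 1, 0, 0, 3, 1, 0]

def poisson_pmf(k, theta):
    """Poisson probability mass function p(k | theta)."""
    return math.exp(-theta) * theta**k / math.factorial(k)

def likelihood(theta):
    """L(theta): product of p(k_i | theta) over the sample."""
    prod = 1.0
    for k in sample:
        prod *= poisson_pmf(k, theta)
    return prod

def log_likelihood(theta):
    """ln L(theta): sum of ln p(k_i | theta) over the sample."""
    return sum(math.log(poisson_pmf(k, theta)) for k in sample)

# Grid search over theta: both criteria pick out the same maximizer,
# which for the Poisson model is the sample mean (here 8/10 = 0.8).
grid = [i / 100 for i in range(1, 301)]
theta_L = max(grid, key=likelihood)
theta_logL = max(grid, key=log_likelihood)
assert theta_L == theta_logL
```

Maximizing the log-likelihood is also numerically safer: the raw likelihood is a product of many small probabilities and underflows quickly for realistic sample sizes.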

When working with counting variables, it is often easier to use the observed frequencies

$$f_k = \#\{\text{observations equal to } k\}, \quad k = 0, 1, 2, \ldots \qquad (1.46)$$

In other words, $f_k$ is the number of times that the value $k$ has been observed in the sample.

Denoting the largest observation as

$$k_{\max} = \max_{i=1,\ldots,n} k_i,$$

the log-likelihood becomes

$$L(\theta) = \sum_{k=0}^{k_{\max}} f_k \ln p(k|\theta).$$
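The frequency form of the log-likelihood can be verified against the per-observation form. This sketch (using the same hypothetical Poisson sample as an assumption, not data from the text) tabulates the frequencies $f_k$ of Equation (1.46) and shows that summing $f_k \ln p(k|\theta)$ over $k = 0, \ldots, k_{\max}$ agrees with summing $\ln p(k_i|\theta)$ over the observations:

```python
import math
from collections import Counter

# Hypothetical claim-count sample (illustration only, not data from the text).
sample = [0, 1, 0, 2, 1, 0, 0, 3, 1, 0]

def poisson_log_pmf(k, theta):
    """ln p(k | theta) for the Poisson distribution."""
    return -theta + k * math.log(theta) - math.log(math.factorial(k))

# Observed frequencies f_k, as in Equation (1.46); Counter returns 0
# for values of k never observed, so missing counts contribute nothing.
freq = Counter(sample)
k_max = max(sample)

theta = 0.8  # evaluate the log-likelihood at one parameter value

# Frequency form: sum over the distinct values k = 0, ..., k_max.
loglik_freq = sum(freq[k] * poisson_log_pmf(k, theta)
                  for k in range(k_max + 1))

# Per-observation form: sum over the n observations directly.
loglik_obs = sum(poisson_log_pmf(k, theta) for k in sample)

assert abs(loglik_freq - loglik_obs) < 1e-9
```

Grouping by frequencies is the natural representation for claim-count data, since the sample typically contains only a handful of distinct values however large $n$ is.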
