Rating Models and Validation - Oesterreichische Nationalbank


Chart 65: Interpretation of the Bayesian Error Rate as the Lowest Overall Error where p = 50%

Entropy-Based Measures of Discriminatory Power

These measures of discriminatory power assess the information gained by using the rating model. In this context, information is defined as a value which is measurable in absolute terms and which equals the level of knowledge about a future event.

Let us first assume that the average default probability of all cases in the segment in question is unknown. If we look at an individual case in this scenario without being able to estimate its credit quality using rating models or other assumptions, we do not possess any information about the (good or bad) future default status of the case. In this scenario, the information an observer gains by waiting for the future status is at its maximum.

If, however, the average probability of a credit default is known, the information gained by actually observing the future status of the case is lower due to the previously available information.

These considerations lead to the definition of the "information entropy" value, which is represented as follows for dichotomous events with a probability of occurrence p for the "1" event (in this case the credit default):

H_0 = -\{p \log_2(p) + (1-p) \log_2(1-p)\}

H0 refers to the absolute information value which is required in order to determine the future default status, or conversely the information value which is gained by observing the "credit default/no credit default" event. Thus entropy can also be interpreted as a measure of uncertainty as to the outcome of an event.

H0 reaches its maximum value of 1 when p = 50%, that is, when default and non-default are equally probable. H0 equals zero when p takes the value 0 or 1, that is, when the future default status is already known with certainty in advance.
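These properties of H0 can be verified numerically. The following is a minimal sketch (not part of the original guideline); the function name `h0` is our own choice:

```python
import math

def h0(p):
    """Information entropy of a dichotomous event with
    probability of occurrence p, in bits."""
    if p in (0.0, 1.0):
        return 0.0  # outcome known with certainty: no information to gain
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(h0(0.5))   # maximum uncertainty: default and non-default equally probable
print(h0(0.01))  # low default probability: little residual uncertainty
print(h0(1.0))   # default certain in advance: zero entropy
```

Note that entropy is symmetric in p and 1 - p, so a portfolio with a 1% default rate carries the same uncertainty as one with a 99% default rate.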

Conditional entropy is defined with conditional probabilities p(·|c) instead of absolute probabilities p; the conditional probabilities are based on condition c.
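To illustrate how conditioning on a rating reduces entropy, consider a hypothetical portfolio split into three rating grades with assumed conditional default probabilities p(default|c) and portfolio shares; the grade labels and all numbers below are illustrative assumptions, not figures from the guideline:

```python
import math

def entropy(p):
    # Information entropy of a dichotomous event with probability p, in bits
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Hypothetical grades c: (conditional default probability, portfolio share)
grades = {"A": (0.01, 0.5), "B": (0.05, 0.3), "C": (0.20, 0.2)}

# Unconditional (average) default probability of the portfolio
p_total = sum(pd * share for pd, share in grades.values())

# Expected conditional entropy: share-weighted average over the grades
h_cond = sum(share * entropy(pd) for pd, share in grades.values())

print(entropy(p_total))  # uncertainty without the rating model
print(h_cond)            # remaining uncertainty given the rating grade
```

The expected conditional entropy is lower than the unconditional entropy; the difference is the information gained by using the rating model.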


Guidelines on Credit Risk Management
