Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)

plot), they are. This serves us very well since we’re looking for a symmetrical function that maps logit values into probabilities.

"Why does it need to be symmetrical?"

If the function weren’t symmetrical, different choices for the positive class would produce models that were not equivalent. But, using a symmetrical function, we could train two equivalent models using the same dataset, just flipping the classes:

• Blue Model (the positive class (y=1) corresponds to blue points)

◦ Data Point #1: P(y=1) = P(blue) = .83 (which is the same as P(red) = .17)

• Red Model (the positive class (y=1) corresponds to red points)

◦ Data Point #1: P(y=1) = P(red) = .17 (which is the same as P(blue) = .83)
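To make the equivalence concrete, here is a quick numerical sketch of the example above (the probabilities are the hypothetical ones for Data Point #1, and `odds_ratio()` follows the definition from earlier in the chapter):

```python
def odds_ratio(prob):
    # odds ratio, as defined earlier in the chapter: p / (1 - p)
    return prob / (1 - prob)

p_blue = .83  # Blue Model: P(y=1) = P(blue) for Data Point #1
p_red = .17   # Red Model:  P(y=1) = P(red)  for the same point

# Flipping the positive class simply inverts the odds ratio:
# the two values are reciprocals of each other (~4.88 and ~0.20)
print(odds_ratio(p_blue), odds_ratio(p_red))
print(odds_ratio(p_blue) * odds_ratio(p_red))  # product is ~1.0
```

Since the two odds ratios are reciprocals, neither model carries more information than the other; which class we call "positive" is just a labeling choice.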

Log Odds Ratio

By taking the logarithm of the odds ratio, the function is not only symmetrical, but also maps probabilities into real numbers, instead of only the positive ones:

Equation 3.6 - Log odds ratio

log odds ratio(p) = log(odds ratio(p)) = log(p / (1 - p))

In code, our log_odds_ratio() function looks like this:

import numpy as np  # odds_ratio() was defined earlier in the chapter

def log_odds_ratio(prob):
    return np.log(odds_ratio(prob))

p = .75
q = 1 - p
log_odds_ratio(p), log_odds_ratio(q)

Output

(1.0986122886681098, -1.0986122886681098)
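Beyond the symmetry around p = .5, we can also check the claim that the log odds ratio covers the whole real line. The sketch below (not from the book's code, just an illustration) evaluates it for probabilities near the extremes:

```python
import numpy as np

def odds_ratio(prob):
    return prob / (1 - prob)

def log_odds_ratio(prob):
    return np.log(odds_ratio(prob))

# Probabilities close to zero map to large NEGATIVE numbers,
# probabilities close to one map to large POSITIVE numbers,
# and p = .5 sits exactly at zero
for p in [.001, .25, .5, .75, .999]:
    print(f"p = {p:.3f} -> log odds ratio = {log_odds_ratio(p): .4f}")
```

As p approaches 0 or 1, the log odds ratio goes to minus or plus infinity, so every real number corresponds to exactly one probability, which is precisely what we need from a function that maps logits to probabilities.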

214 | Chapter 3: A Simple Classification Problem
