thresholds do not necessarily include the extremes: zero and one. In Scikit-Learn's PR curve, the right-most point is clearly different from ours.

"How come the PR curve dips to lower precision? Shouldn’t it always

go up as we raise the threshold, moving to the left along the curve?"

The Precision Quirk

Glad you asked! This is very annoying and somewhat counterintuitive, but it happens often, so let's take a closer look at it. To illustrate why this happens, I will plot the probability lines for three distinct thresholds: 0.4, 0.5, and 0.57.

Figure 3.19 - The precision quirk

At the top, with a threshold of 0.4, we have 15 points on the right (classified as positive), two of which are false positives. The precision is given by:
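Precision(0.40) = TP / (TP + FP) = 13 / (13 + 2) = 13/15 ≈ 0.87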

But if we move the threshold to the right, up to 0.5, we lose one true positive, effectively reducing precision:
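Precision(0.50) = 12 / (12 + 2) = 12/14 ≈ 0.86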

This is a temporary side effect, though. As we raise the threshold even further, to 0.57, we get the benefit of getting rid of a false positive, thus increasing precision:
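Precision(0.57) = 12 / (12 + 1) = 12/13 ≈ 0.92

To make the quirk concrete, here is a minimal sketch (not the book's own code) that recomputes precision at the three thresholds. The probabilities and labels are hypothetical, chosen only so the counts match the description above: 13 true positives and two false positives at or above 0.4, one true positive between 0.4 and 0.5, and one false positive between 0.5 and 0.57. Points at or above the threshold are classified as positive here.

import numpy as np

# hypothetical labels: 13 positives, 7 negatives (not the book's actual data)
y_true = np.array([1] * 12 + [1] + [0, 0] + [0] * 5)
# hypothetical probabilities: 12 positives above 0.57, one positive at 0.45,
# two negatives at 0.53 and 0.60 (the false positives), five easy negatives
y_prob = np.array([0.60, 0.62, 0.65, 0.70, 0.72, 0.75,
                   0.80, 0.82, 0.85, 0.90, 0.92, 0.95,  # true positives
                   0.45,                                 # TP lost at 0.50
                   0.53, 0.60,                           # false positives
                   0.05, 0.10, 0.15, 0.20, 0.30])        # true negatives

for threshold in [0.40, 0.50, 0.57]:
    y_pred = (y_prob >= threshold).astype(int)
    tp = ((y_pred == 1) & (y_true == 1)).sum()
    fp = ((y_pred == 1) & (y_true == 0)).sum()
    print(f'threshold={threshold:.2f}  precision={tp / (tp + fp):.4f}')

# threshold=0.40  precision=0.8667
# threshold=0.50  precision=0.8571
# threshold=0.57  precision=0.9231

The precision dips at 0.50 and recovers past 0.57, which is exactly the kind of dip you see along the PR curve.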
