Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub

Output

array([[0.5504593 ],
       [0.94999564],
       [0.9757515 ],
       [0.22519748]], dtype=float32)

Now we’re talking! These are the probabilities, given our model, of those four points being positive examples.

Lastly, we need to go from probabilities to classes. If the probability is greater than or equal to a threshold, it is a positive example. If it is less than the threshold, it is a negative example. Simple enough. The trivial choice of a threshold is 0.5:

Equation 3.19 - From probabilities to classes

$$\hat{y} = \begin{cases}1, & \text{if } P(y=1) \ge 0.5 \\ 0, & \text{if } P(y=1) < 0.5\end{cases}$$

But the probability itself is just the sigmoid function applied to the logit (z):

Equation 3.20 - From logits to classes, via sigmoid function

$$\hat{y} = \begin{cases}1, & \text{if } \sigma(z) \ge 0.5 \\ 0, & \text{if } \sigma(z) < 0.5\end{cases}$$

But the sigmoid function has a value of 0.5 only when the logit (z) has a value of zero:

Equation 3.21 - From logits to classes, directly

$$\hat{y} = \begin{cases}1, & \text{if } z \ge 0 \\ 0, & \text{if } z < 0\end{cases}$$

Thus, if we don’t care about the probabilities, we could use the predictions (logits) directly to get the predicted classes for the data points:
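The equivalence between the two thresholding rules can be sketched in NumPy (a minimal illustration, not the book's own code; the log-odds line simply inverts the sigmoid to recover the logits from the probabilities shown above):

```python
import numpy as np

# Probabilities produced by the model for the four points (from the output above)
probabilities = np.array([[0.5504593 ],
                          [0.94999564],
                          [0.9757515 ],
                          [0.22519748]], dtype=np.float32)

# Rule from Equation 3.19: threshold the probabilities at 0.5
classes_from_probs = (probabilities >= 0.5).astype(int)

# Recover the logits via the log-odds (inverse sigmoid), for illustration only
logits = np.log(probabilities / (1 - probabilities))

# Rule from Equation 3.21: threshold the logits at zero
classes_from_logits = (logits >= 0).astype(int)

print(classes_from_probs.squeeze())  # [1 1 1 0]
print((classes_from_probs == classes_from_logits).all())  # True
```

Since sigmoid(0) = 0.5, both rules assign the same class to every point, which is why the probabilities can be skipped entirely when only the predicted classes matter.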

Model Training | 235
