Making Predictions (Classes)

# logits >= 0 correspond to probabilities >= 0.5, since sigmoid(0) = 0.5
classes = (predictions >= 0).astype(int)

classes

Output

array([[1],
       [1],
       [1],
       [0]])

Clearly, the points where the logits (z) equal zero determine the boundary between positive and negative examples.
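Since the sigmoid function maps z = 0 to a probability of exactly 0.5, thresholding probabilities at 0.5 produces the same classes as thresholding logits at zero. Here is a minimal sketch of that equivalence, using made-up logit values in place of actual model outputs:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# made-up logits standing in for the model's predictions
logits = np.array([[2.3], [0.7], [0.1], [-1.2]])

# thresholding probabilities at 0.5...
classes_from_probs = (sigmoid(logits) >= 0.5).astype(int)
# ...is equivalent to thresholding logits at zero
classes_from_logits = (logits >= 0).astype(int)

assert np.array_equal(classes_from_probs, classes_from_logits)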

"Why 0.5? Can I choose a different threshold?"

Sure, you can! Different thresholds will give you different confusion matrices and, therefore, different metrics, like accuracy, precision, and recall. We'll get back to that in the "Decision Boundary" section, with a sketch of a different threshold right below.
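As a sketch of how a different threshold plays out in logit space (again with made-up logits): a probability threshold p corresponds to a logit threshold of log(p / (1 - p)), the log-odds of p; for p = 0.5 this recovers the logit threshold of zero.

import numpy as np

# made-up logits standing in for the model's predictions
logits = np.array([[2.3], [0.7], [0.1], [-1.2]])

# a stricter probability threshold, say p = 0.7, maps to a logit
# threshold of log(p / (1 - p)), the log-odds of p
p = 0.7
logit_threshold = np.log(p / (1 - p))  # roughly 0.847

classes = (logits >= logit_threshold).astype(int)
# array([[1], [0], [0], [0]])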

By the way, are you still holding that thought about the "glorified linear regression?" Good!

Decision Boundary

We have just figured out that whenever z equals zero, we are at the decision boundary. But z is given by a linear combination of features x₁ and x₂. If we work out some basic operations, we arrive at:

$$z = b + w_1 x_1 + w_2 x_2 = 0 \implies x_2 = -\frac{b}{w_2} - \frac{w_1}{w_2}\,x_1$$

Equation 3.22 - Decision boundary for logistic regression with two features

Given our model (b, w₁, and w₂), for any value of the first feature (x₁), we can compute the corresponding value of the second feature (x₂) that sits exactly at the decision boundary.
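A minimal sketch of that computation, using hypothetical parameter values in place of a trained model's:

import numpy as np

def boundary_x2(x1, b, w1, w2):
    # solve b + w1*x1 + w2*x2 = 0 for x2
    return -(b + w1 * x1) / w2

# hypothetical parameters; substitute the values from your trained model
b, w1, w2 = 0.5, -1.0, 2.0

x1 = np.linspace(-2, 2, 5)
x2 = boundary_x2(x1, b, w1, w2)  # points (x1, x2) lying on the boundary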
