
import torch.nn as nn

# Builds the model layer by layer, naming each module
model = nn.Sequential()
model.add_module('hidden', nn.Linear(2, 10))   # two features in, ten hidden units out
model.add_module('activation', nn.ReLU())      # non-linearity between the two layers
model.add_module('output', nn.Linear(10, 1))   # ten hidden units in, one logit out
model.add_module('sigmoid', nn.Sigmoid())      # converts the logit into a probability

loss_fn = nn.BCELoss()                         # expects probabilities, not logits
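To see how the pieces fit together, here is a minimal sketch of a forward pass and loss computation; the mini-batch (dummy_x, dummy_y) is made up purely for illustration:

import torch

# Hypothetical mini-batch: four points with two features each,
# and their binary labels (made up for illustration)
dummy_x = torch.randn(4, 2)
dummy_y = torch.randint(0, 2, (4, 1)).float()

probabilities = model(dummy_x)          # shape (4, 1), values in (0, 1)
loss = loss_fn(probabilities, dummy_y)  # BCELoss compares probabilities to labels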

The model above increases dimensionality from two dimensions (two features) to ten dimensions and then uses those ten dimensions to compute logits. But it only works if there is an activation function between the layers.

I suppose you may have two questions right now: "Why is that?" and "What actually is an activation function?" Fair enough. But these are topics for the next chapter.
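As a quick numerical preview of the first question, here is a minimal sketch (the layer sizes are chosen to match the model above, but the variable names are mine) showing that two linear layers stacked without an activation collapse into a single affine transformation:

import torch
import torch.nn as nn

# Two linear layers with no activation in between
first = nn.Linear(2, 10)
second = nn.Linear(10, 1)

# Their composition is still affine: W = W2 @ W1, b = W2 @ b1 + b2
combined = nn.Linear(2, 1)
with torch.no_grad():
    combined.weight.copy_(second.weight @ first.weight)
    combined.bias.copy_(second.weight @ first.bias + second.bias)

x = torch.randn(4, 2)
print(torch.allclose(second(first(x)), combined(x), atol=1e-6))  # True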

Classification Threshold

This section is optional. In it, I will dive deeper into using different thresholds for classification and how this affects the confusion matrix. I will explain the most common classification metrics: true and false positive rates, precision and recall, and accuracy. Finally, I will show you how these metrics can be combined to build ROC and Precision-Recall curves.

If you are already comfortable with these concepts, feel free to skip this section.

So far, we’ve been using the trivial threshold of 50% to classify our data points, given the probabilities predicted by our model. Let’s dive a bit deeper into this and see the effects of choosing different thresholds. We’ll be working with the data points in the validation set. There are only 20 data points in it, so we can easily keep track of all of them.
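As a concrete illustration, here is a minimal sketch of applying a threshold to predicted probabilities and tallying the confusion matrix entries; the validation tensors (x_val, y_val) are assumed to exist from the earlier data preparation steps:

import torch

# Assumed to exist from earlier in the chapter: model, x_val, y_val
threshold = 0.5

with torch.no_grad():
    probabilities = model(x_val)                    # shape (20, 1)
predictions = (probabilities >= threshold).float()  # 1 if at or above threshold, else 0

# Confusion matrix entries for the positive class
tp = ((predictions == 1) & (y_val == 1)).sum().item()
fp = ((predictions == 1) & (y_val == 0)).sum().item()
tn = ((predictions == 0) & (y_val == 0)).sum().item()
fn = ((predictions == 0) & (y_val == 1)).sum().item()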

