
classification: The predicted class is the one with the largest logit.

If there is a single column of logits, that would be a binary classification: The predicted class will be the positive class if the predicted probability is above a given threshold (usually 0.5). But there's a catch here: If the last layer of the model is not a sigmoid, we need to apply a sigmoid to the logits first to get the probabilities, and only then compare them with the threshold.
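A minimal sketch of that decision logic is shown below; the tensor and variable names are illustrative and do not reproduce the book's actual correct() implementation:

import torch

# illustrative logits for a mini-batch (shape: n_samples x n_dims)
logits = torch.tensor([[ 1.2, -0.3,  0.5],
                       [-0.8,  2.1,  0.1]])
n_samples, n_dims = logits.shape

if n_dims > 1:
    # multiclass: the largest logit wins, no probabilities needed
    predicted = torch.argmax(logits, dim=1)
else:
    # binary: apply a sigmoid to get probabilities first (unless the
    # model already ends in a sigmoid), then compare with the threshold
    predicted = (torch.sigmoid(logits) > 0.5).long()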

Then, for each possible class, the method figures out how many predictions match the labels, and appends the result to a tensor. The shape of the resulting tensor will be (number of classes, 2), the first column representing correct predictions, the second, the number of data points in that class.
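To make that counting concrete, here is a minimal sketch of the per-class tally, using made-up predicted and labels tensors rather than the book's actual data:

import torch

# made-up predictions and labels for a mini-batch of nine data points
predicted = torch.tensor([0, 0, 1, 2, 0, 1, 2, 0, 2])
labels = torch.tensor([0, 0, 1, 2, 0, 1, 2, 1, 2])

results = []
for c in labels.unique():
    n_class = (labels == c).sum().item()                    # data points in class c
    n_correct = (predicted[labels == c] == c).sum().item()  # correct predictions for class c
    results.append([n_correct, n_class])

# shape: (number of classes, 2)
per_class = torch.tensor(results)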

Let’s try applying this new method to the first mini-batch of our data loader:

sbs_cnn1.correct(images_batch, labels_batch)

Output

tensor([[5, 7],
        [3, 3],
        [6, 6]])

So, there are only two wrong predictions, both for class #0 (parallel lines), corresponding to images #6 and #8, as we've already seen in the previous section.

"What if I want to compute it for all mini-batches in a data loader?"

Loader Apply

On it! That’s the role of the static method loader_apply(): It applies a function to each mini-batch, and stacks the results before applying a reducing function such as sum or mean.
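The listing below is one way such a static method could be structured; treat it as a sketch rather than the exact implementation in the StepByStep class:

import torch

class StepByStep(object):
    # only the method discussed here is sketched
    @staticmethod
    def loader_apply(loader, func, reduce='sum'):
        # apply func to every mini-batch and stack the per-batch results
        results = [func(x, y) for x, y in loader]
        results = torch.stack(results, dim=0)
        # reduce across mini-batches: sum the counts, or average them
        if reduce == 'sum':
            results = results.sum(dim=0)
        elif reduce == 'mean':
            results = results.float().mean(dim=0)
        return results

With something along these lines, counting correct predictions over an entire data loader becomes a single call, such as StepByStep.loader_apply(val_loader, sbs_cnn1.correct), assuming val_loader is the data loader in question.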
