
the true label.

As you’ve probably noticed, I used the functional version of the loss in the snippet of code above: F.nll_loss(). But, as we’ve done with the binary cross-entropy loss in Chapter 3, we’re likely to use the module version: nn.NLLLoss().
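In case it helps to see the two flavors side by side, here is a minimal sketch of my own (not taken from the book; the tensors and variable names are made up) showing that both produce the same value:

import torch
import torch.nn as nn
import torch.nn.functional as F

# made-up log probabilities for 4 data points and 3 classes
log_probs = F.log_softmax(torch.randn(4, 3), dim=-1)
labels = torch.tensor([0, 2, 1, 0])

print(F.nll_loss(log_probs, labels))    # functional version
print(nn.NLLLoss()(log_probs, labels))  # module version: build it first, then call it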

Just like before, this loss function is a higher-order function, and this one takes three optional arguments (the others are deprecated, and you can safely ignore them):

• reduction: It takes either mean, sum, or none. The default, mean, corresponds to Equation 5.10. As expected, sum will return the sum of the errors instead of the average. The last option, none, corresponds to the unreduced form; that is, it returns the full array of errors.

• weight: It takes a tensor of length C; that is, containing as many weights as there are classes.

IMPORTANT: This argument can be used to handle imbalanced datasets, unlike the weight argument in the binary cross-entropy loss we’ve seen in Chapter 3. Also, unlike the pos_weight argument of nn.BCEWithLogitsLoss(), nn.NLLLoss() computes a true weighted average when this argument is used.

• ignore_index: It takes one integer, which corresponds to the one (and only one) class index that should be ignored when computing the loss. It can be used to mask a particular label that is not relevant to the classification task.

Let’s go through some quick examples using the arguments above. First, we need to generate some dummy logits (we’ll keep using three classes, though) and the corresponding log probabilities:

torch.manual_seed(11)
dummy_logits = torch.randn((5, 3))
dummy_labels = torch.tensor([0, 0, 1, 2, 1])
dummy_log_probs = F.log_softmax(dummy_logits, dim=-1)
dummy_log_probs
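As a quick sketch of my own (not the book’s next step), here is how each of the three arguments could be applied to the dummy tensors above; it assumes torch.nn has been imported as nn, and the class weights are made up:

# default reduction='mean' averages the five errors; 'sum' adds them up;
# 'none' returns one error per data point
loss_mean = nn.NLLLoss()(dummy_log_probs, dummy_labels)
loss_sum = nn.NLLLoss(reduction='sum')(dummy_log_probs, dummy_labels)
loss_none = nn.NLLLoss(reduction='none')(dummy_log_probs, dummy_labels)  # shape (5,)

# weight: one (made-up) weight per class; the result is a true weighted average
class_weights = torch.tensor([1.0, 1.0, 2.0])
loss_weighted = nn.NLLLoss(weight=class_weights)(dummy_log_probs, dummy_labels)

# ignore_index: data points labeled 2 are left out of the loss entirely
loss_masked = nn.NLLLoss(ignore_index=2)(dummy_log_probs, dummy_labels)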
