Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)
Model       Size  Classifier Layer(s)   Replacement Layer(s)

ResNet      224   model.fc              nn.Linear(512, num_classes)
DenseNet    224   model.classifier      nn.Linear(1024, num_classes)
SqueezeNet  224   model.classifier[1]   nn.Conv2d(512, num_classes, kernel_size=1, stride=1)
Inception   299   model.AuxLogits.fc    nn.Linear(768, num_classes)
                  model.fc              nn.Linear(2048, num_classes)

"Why are there two layers for the Inception V3 model?"

The Inception model is a special case because it has auxiliary classifiers. We’ll

discuss them later in this chapter.

Model Configuration

What’s missing in the model configuration? A loss function and an optimizer. A

multiclass classification problem, when the model produces logits, requires

nn.CrossEntropyLoss() as the loss function. For the optimizer, let’s use Adam with

the "Karpathy Constant" (3e-4) as its learning rate.

Model Configuration

torch.manual_seed(17)
multi_loss_fn = nn.CrossEntropyLoss(reduction='mean')
optimizer_alex = optim.Adam(alex.parameters(), lr=3e-4)

Cool, the model configuration is taken care of, so we can turn our attention to the…

Data Preparation

This step is quite similar to what we did in the previous chapter (we’re still using

the Rock Paper Scissors dataset), except for one key difference: We will use

different parameters for standardizing the images.

"So we’re not computing statistics for the images in our dataset

anymore?"

Nope!

"Why not?"
