
• freezing the layers of the model

• replacing the "top" layer of the model

• understanding the difference between fine-tuning and feature extraction

• using ImageNet statistics to pre-process the images

• generating a dataset of features using the frozen model

• training an independent model and attaching it to the original model

• understanding the role of auxiliary classifiers in very deep architectures

• building a loss function that handles auxiliary classifiers too

• training the "top" layer of an Inception V3 model

• using 1x1 convolutions as a dimension-reduction layer

• building an Inception module

• understanding what a batch normalization layer does

• discussing where to place the batch normalization layer, before or after an activation function

• understanding the impact of mini-batch size on batch normalization statistics

• understanding the regularizer effect of batch normalization

• observing the effect of train and eval modes in batch normalization layers

• understanding what a residual / skip connection is

• understanding the effect of skip connections on the loss surface

• building a residual block

• fine-tuning and extracting features using a ResNet18 model
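The first two items, freezing layers and replacing the "top" layer, can be sketched in a few lines. This is a minimal illustration using a tiny `nn.Sequential` as a stand-in for a pretrained network (in practice you would load, say, torchvision's ResNet18 with pretrained weights):

```python
import torch
import torch.nn as nn

# Tiny stand-in for a pretrained model; the last layer plays
# the role of the "top" layer (here, a 10-class classifier)
model = nn.Sequential(
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Linear(16, 10),
)

# Freezing: turn off gradient computation for every parameter
for param in model.parameters():
    param.requires_grad = False

# Replacing the "top" layer: a freshly created layer has
# requires_grad=True by default, so only it will be trained
model[-1] = nn.Linear(16, 3)  # new task with 3 classes

trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only the new top layer's parameters
```

If you train this model as-is (computing gradients through the frozen layers every time), that's fine-tuning with a frozen body; generating the features once with the frozen part and training the top independently is feature extraction.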
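Pre-processing with ImageNet statistics amounts to normalizing each channel with the mean and standard deviation computed over the ImageNet dataset. A minimal sketch using plain tensors (in practice, this is usually done with a `Normalize` transform):

```python
import torch

# Per-channel ImageNet statistics (RGB), reshaped for broadcasting
imagenet_mean = torch.tensor([0.485, 0.456, 0.406]).view(1, 3, 1, 1)
imagenet_std = torch.tensor([0.229, 0.224, 0.225]).view(1, 3, 1, 1)

images = torch.rand(4, 3, 224, 224)  # a mini-batch scaled to [0, 1]
normalized = (images - imagenet_mean) / imagenet_std
```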
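A loss function that handles auxiliary classifiers can be sketched like this: in training mode, a model such as Inception V3 returns both the main logits and the auxiliary logits, and the auxiliary loss is added with a small weight (0.4 is a common choice; treat the exact value as an assumption here):

```python
import torch
import torch.nn as nn

def loss_fn(outputs, labels, aux_weight=0.4):
    # Hypothetical sketch: in training mode the model returns a tuple
    # (main_logits, aux_logits); in eval mode, only the main logits
    ce = nn.CrossEntropyLoss()
    if isinstance(outputs, tuple):
        main_logits, aux_logits = outputs
        return ce(main_logits, labels) + aux_weight * ce(aux_logits, labels)
    return ce(outputs, labels)
```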
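The 1x1 convolution as a dimension-reduction layer is worth a one-liner: it mixes channels at each spatial location without touching the spatial dimensions, so it can shrink the channel count cheaply before an expensive 3x3 or 5x5 convolution (the channel sizes below are arbitrary):

```python
import torch
import torch.nn as nn

# A 1x1 convolution reduces the number of channels (192 -> 32)
# while leaving height and width untouched
reduce = nn.Conv2d(in_channels=192, out_channels=32, kernel_size=1)

x = torch.rand(1, 192, 28, 28)
out = reduce(x)
print(out.shape)  # torch.Size([1, 32, 28, 28])
```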
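The effect of train and eval modes on a batch normalization layer also fits in a short sketch: in training mode the layer normalizes using the current mini-batch's statistics (and updates its running statistics), while in eval mode it uses the accumulated running statistics instead, so the same input produces different outputs:

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm2d(3)
x = torch.rand(8, 3, 4, 4)

bn.train()
out_train = bn(x)  # normalized with the mini-batch's own statistics

bn.eval()
out_eval = bn(x)   # normalized with the running statistics instead
```

This is also why the mini-batch size matters: the smaller the batch, the noisier the statistics used in training mode, which is part of batch normalization's regularizing effect.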
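Finally, a residual block in its minimal form: compute a transformation of the input, add the input back through the skip connection, and only then apply the final activation (the layer sizes below are illustrative):

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    # Minimal sketch of a residual block: output = relu(F(x) + x)
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        identity = x  # the skip connection keeps the input around
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity  # add the input back before the activation
        return self.relu(out)

block = ResidualBlock(16)
y = block(torch.rand(2, 16, 8, 8))
```

Because the block only has to learn the residual F(x) rather than the full mapping, stacking many of them (as ResNet18 does) stays trainable, which is the effect on the loss surface mentioned above.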

Congratulations! You have just finished the fourth and final chapter of Part II (not counting the "Extra" chapter)! You are now familiar with the most important tools and techniques for handling computer vision problems. Although there will always be a lot to learn, since the field is very dynamic and new techniques are constantly being developed, I believe that having a good grasp of how these building blocks work will make it easier for you to further explore and keep learning on your own.

In the next part, we’ll shift our focus to sequences and a whole new class of models: recurrent neural networks and their variants.

