Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


Finally, to illustrate the effect of the skip connection on an image, I’ve passed one of the images from the Rock Paper Scissors dataset through a randomly initialized residual block (three channels in and out, no downsampling), with and without a skip connection. These are the results.

Figure 7.11 - Skip connection in action

On the one hand (that’s a good pun, c’mon!), if there are no skip connections, some information may be lost, like the different shades on the back of the hand. On the other hand (sorry!), skip connections may help to preserve that information.

That’s the general idea behind the ResNet model. Of course, the whole architecture is more complex than that, involving stacking many different residual blocks and adding some more bells and whistles to the mix. We’re not going into any more details here, but the pre-trained models can easily be used for transfer learning, just like we did with the AlexNet model.
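The with-and-without comparison above can be sketched in a few lines of PyTorch. This is a minimal, hypothetical residual block (not the book's actual implementation): three channels in and out, no downsampling, and a flag that toggles the skip connection so you can reproduce the experiment on any image tensor.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Minimal residual block: two 3x3 convolutions with batch norm.

    With use_skip=True, the input is added back to the convolutional
    path's output before the final activation; with use_skip=False,
    only the convolutional path is returned.
    """
    def __init__(self, channels=3, use_skip=True):
        super().__init__()
        self.use_skip = use_skip
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        identity = x
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        if self.use_skip:
            out = out + identity  # the skip connection
        return self.relu(out)

# A random tensor stands in for a Rock Paper Scissors image here
torch.manual_seed(42)
x = torch.randn(1, 3, 28, 28)
with_skip = ResidualBlock(use_skip=True)(x)
without_skip = ResidualBlock(use_skip=False)(x)
print(with_skip.shape)  # torch.Size([1, 3, 28, 28])
```

Since the skip connection is a plain element-wise addition, the input and output must have the same shape, which is exactly why this block uses matching channel counts and no downsampling.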

Putting It All Together

In this chapter, we’ve gone through the necessary steps to use transfer learning with pre-trained models for computer vision tasks: using ImageNet statistics for pre-processing the inputs, freezing layers (or not), replacing the "top" layer, and optionally speeding up training by generating features and training the "top" of the model independently.

554 | Chapter 7: Transfer Learning