Deep Learning with PyTorch Step-by-Step: A Beginner's Guide, by Daniel Voigt Godoy


Extra Chapter

Vanishing and Exploding Gradients

Spoilers

In this chapter, we will:

• tackle the vanishing gradients problem using initialization schemes

• understand the effect of batch normalization on the vanishing gradients problem

• tackle the exploding gradients problem using gradient clipping

• clip gradients in different ways: element-wise, by their norm, and using hooks

• understand the difference between clipping gradients after or during backpropagation
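Before diving in, here is a minimal sketch of the three clipping approaches listed above, using PyTorch's `clip_grad_value_` and `clip_grad_norm_` utilities and a gradient hook. The model, data shapes, and thresholds are illustrative only:

```python
import torch
import torch.nn as nn

torch.manual_seed(42)
model = nn.Linear(10, 1)
x, y = torch.randn(64, 10), torch.randn(64, 1)

# --- Clipping AFTER backpropagation ---
loss = ((model(x) - y) ** 2).mean()
loss.backward()

# Element-wise: clamp every gradient component to [-0.1, 0.1]
nn.utils.clip_grad_value_(model.parameters(), clip_value=0.1)

# Norm-based: rescale all gradients together so their total norm is <= 1.0
nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)

# --- Clipping DURING backpropagation, via hooks ---
# Each hook clamps a parameter's gradient the moment it is computed,
# so downstream gradients are built from already-clipped values.
for p in model.parameters():
    p.register_hook(lambda grad: torch.clamp(grad, -0.1, 0.1))

model.zero_grad()
loss = ((model(x) - y) ** 2).mean()
loss.backward()  # gradients arrive already clipped
```

Note the difference in scope: element-wise clipping caps each component independently, norm-based clipping preserves the gradient's direction while shrinking its magnitude, and hooks act on each tensor while backpropagation is still running.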

Jupyter Notebook

The Jupyter notebook corresponding to Chapter Extra [131] is part of the official

Deep Learning with PyTorch Step-by-Step repository on GitHub. You can also run it

directly in Google Colab [132].

If you’re using a local installation, open your terminal or Anaconda prompt and

navigate to the PyTorchStepByStep folder you cloned from GitHub. Then, activate

the pytorchbook environment and run jupyter notebook:

$ conda activate pytorchbook

(pytorchbook)$ jupyter notebook

If you’re using Jupyter’s default settings, this link should open Chapter Extra’s

notebook. If not, just click on ChapterExtra.ipynb in your Jupyter’s home page.

Imports

For the sake of organization, all libraries needed for the code in any

given chapter are imported at its very beginning. For this chapter, we’ll need the

following imports:
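The original import cell did not survive extraction. Based on the topics this chapter covers (initialization schemes, batch normalization, and gradient clipping), a plausible set of imports would be the following; the exact list in the notebook may differ:

```python
import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader, TensorDataset
```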

