Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner’s Guide (Leanpub)

In the next few pages, I will present four chunks of code showing different attempts at creating parameters.

The first three attempts are shown to build up to a solution. The first one only works well if you never use a GPU. The second one doesn’t work at all. The third one works, but it is too verbose.

The recommended way of creating parameters is the last:

Notebook Cell 1.4.

The first chunk of code below creates two tensors for our parameters, gradients and all. But they are CPU tensors, by default.

# FIRST
# Initializes parameters "b" and "w" randomly, ALMOST as we
# did in Numpy; since we want to apply gradient descent on
# these parameters, we need to set REQUIRES_GRAD = TRUE
torch.manual_seed(42)
b = torch.randn(1, requires_grad=True, dtype=torch.float)
w = torch.randn(1, requires_grad=True, dtype=torch.float)
print(b, w)

Output

tensor([0.3367], requires_grad=True) tensor([0.1288], requires_grad=True)
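Since nothing in the code above specifies a device, both tensors end up on the CPU. A quick sketch of my own (not from the book's notebook) to confirm the default:

```python
import torch

torch.manual_seed(42)
b = torch.randn(1, requires_grad=True, dtype=torch.float)
w = torch.randn(1, requires_grad=True, dtype=torch.float)

# torch.randn creates tensors on the CPU unless a device is given
print(b.device, w.device)  # cpu cpu
```

This is exactly why the first attempt "only works well if you never use a GPU": the parameters would stay on the CPU while the data might not.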

Never forget to set the seed to ensure reproducibility, just like we did before while using Numpy. PyTorch’s equivalent is torch.manual_seed().

"If I use the same seed in PyTorch as I used in Numpy (or, to put it differently, if I use 42 everywhere), will I get the same numbers?"

Unfortunately, NO.

You’ll get the same numbers for the same seed in the same package. PyTorch generates a number sequence that is different from the one generated by Numpy, even if you use the same seed in both.
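To see this divergence concretely, here is a short sketch of my own (not from the book) drawing one number from each library's generator, both seeded with 42:

```python
import numpy as np
import torch

np.random.seed(42)
torch.manual_seed(42)

np_sample = np.random.randn(1)  # Numpy's generator
pt_sample = torch.randn(1)      # PyTorch's generator

# Same seed, but different random number generators,
# so the two values do not match
print(np_sample[0], pt_sample.item())
```

Each call is individually reproducible; it is only the cross-library agreement that does not hold.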

82 | Chapter 1: A Simple Regression Problem
