Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub

print(error.requires_grad, yhat.requires_grad, \
      b.requires_grad, w.requires_grad)
print(y_train_tensor.requires_grad, x_train_tensor.requires_grad)

Output

True True True True

False False
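The pattern above follows from how autograd propagates the flag: any tensor computed from a tensor that requires gradients will itself require gradients, while plain data tensors do not. A minimal standalone sketch (the tensor names here are illustrative, not the ones from the book's notebook):

```python
import torch

# w is a parameter, so it requires gradients;
# x is plain data, so it does not.
w = torch.randn(1, requires_grad=True)
x = torch.tensor([1.0, 2.0])

# yhat is computed from w, so requires_grad propagates to it
yhat = w * x

print(x.requires_grad)     # False
print(yhat.requires_grad)  # True
```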

grad

What about the actual values of the gradients? We can inspect them by looking at the grad attribute of a tensor.

print(b.grad, w.grad)

Output

tensor([-3.3881], device='cuda:0')

tensor([-1.9439], device='cuda:0')

If you check the backward method’s documentation, it clearly states that gradients are accumulated. What does that mean? It means that, if we run Notebook Cell 1.5's code (Steps 1 to 3) twice and check the grad attribute afterward, we will end up with:

Output

tensor([-6.7762], device='cuda:0')

tensor([-3.8878], device='cuda:0')
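Notice the values are exactly double the previous ones: each call to backward adds the new gradients on top of whatever is already stored in grad. A self-contained toy sketch of that behavior (the data and parameter values here are made up, so the numbers differ from the book's output, but the doubling effect is the same):

```python
import torch

# Toy data and parameters; requires_grad=True marks b and w as
# tensors whose gradients autograd should track.
b = torch.zeros(1, requires_grad=True)
w = torch.zeros(1, requires_grad=True)
x = torch.tensor([1.0, 2.0])
y = torch.tensor([3.0, 5.0])

def loss():
    yhat = b + w * x
    error = yhat - y
    return (error ** 2).mean()

# First backward pass: grad holds the gradients of the loss
loss().backward()
first_b, first_w = b.grad.clone(), w.grad.clone()

# Second backward pass without zeroing: gradients ACCUMULATE,
# so grad is now exactly twice the first result
loss().backward()
print(torch.allclose(b.grad, 2 * first_b))  # True
print(torch.allclose(w.grad, 2 * first_w))  # True
```

This is why training loops call zero_grad (or otherwise reset the gradients) before each new backward pass.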

If you do not have a GPU, your outputs are going to be slightly different (no device suffix, and slightly different values due to random initialization):
Output

tensor([-3.1125]) tensor([-1.8156])

Autograd | 87
