
The make_dot(variable) method from the TorchViz package allows us to easily visualize a graph associated with a given Python variable involved in the gradient computation.

If you chose "Local Installation" in the "Setup Guide" and skipped or had issues with Step 5 ("Install GraphViz software and TorchViz package"), you will get an error when trying to visualize the graphs using make_dot.
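In that case, one simple workaround (a sketch of mine, not part of the book's setup) is to guard the import so a missing installation fails immediately with a pointer to the fix, instead of a confusing error later on:

# Assumption: torchviz is the pip package that provides make_dot; the GraphViz
# software itself still has to be installed separately (Step 5 of the Setup Guide)
try:
    from torchviz import make_dot
except ImportError as exc:
    raise ImportError(
        "TorchViz not found - revisit Step 5 of the Setup Guide "
        "(install GraphViz, then run `pip install torchviz`)."
    ) from exc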

So, let’s stick with the bare minimum: two (gradient-computing) tensors for our parameters, predictions, errors, and loss; these are Steps 0, 1, and 2.

# Step 0 - Initializes parameters "b" and "w" randomly
torch.manual_seed(42)
b = torch.randn(1, requires_grad=True, \
                dtype=torch.float, device=device)
w = torch.randn(1, requires_grad=True, \
                dtype=torch.float, device=device)

# Step 1 - Computes our model's predicted output - forward pass
yhat = b + w * x_train_tensor

# Step 2 - Computes the loss
error = (yhat - y_train_tensor)
loss = (error ** 2).mean()

# We can try plotting the graph for any variable: yhat, error, loss
make_dot(yhat)
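Before rendering anything, we can also peek at the graph directly (a quick check of mine, not part of the book's listing): every tensor that results from an operation on our gradient-computing tensors carries a grad_fn attribute naming the backward node PyTorch recorded for it.

# Assumption: run right after the listing above; grad_fn is populated because
# b and w were created with requires_grad=True
print(yhat.grad_fn)   # something like <AddBackward0 object at 0x...>
print(error.grad_fn)  # something like <SubBackward0 object at 0x...>
print(loss.grad_fn)   # something like <MeanBackward0 object at 0x...>

These are exactly the nodes that make_dot draws.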

Running the code above will produce the graph below:

Figure 1.5 - Computation graph generated for yhat; obs.: the corresponding variable names were inserted manually
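By the way, instead of inserting the names by hand, make_dot also accepts a params dictionary mapping names to tensors, and it returns a GraphViz Digraph that can be rendered to a file. A minimal sketch (the filename is just an example of mine):

# Assumption: same b, w, and yhat as above; 'yhat_graph' is an arbitrary filename
dot = make_dot(yhat, params={'b': b, 'w': w})  # labels the gradient-computing leaf tensors
dot.render('yhat_graph', format='png')         # writes yhat_graph (DOT source) and yhat_graph.png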

