Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub

Notebook Cell 1.10 - PyTorch’s model in action: no more manual prediction / forward step!

# Sets learning rate - this is "eta", the Greek letter that
# looks like an "n"
lr = 0.1

# Step 0 - Initializes parameters "b" and "w" randomly
torch.manual_seed(42)
# Now we can create a model and send it at once to the device
model = ManualLinearRegression().to(device)

# Defines an SGD optimizer to update the parameters
# (now retrieved directly from the model)
optimizer = optim.SGD(model.parameters(), lr=lr)

# Defines an MSE loss function
loss_fn = nn.MSELoss(reduction='mean')

# Defines number of epochs
n_epochs = 1000

for epoch in range(n_epochs):
    model.train() # What is this?!?

    # Step 1 - Computes model's predicted output - forward pass
    # No more manual prediction!
    yhat = model(x_train_tensor)

    # Step 2 - Computes the loss
    loss = loss_fn(yhat, y_train_tensor)

    # Step 3 - Computes gradients for both "b" and "w" parameters
    loss.backward()

    # Step 4 - Updates parameters using gradients and
    # the learning rate
    optimizer.step()
    optimizer.zero_grad()

# We can also inspect its parameters using its state_dict
print(model.state_dict())
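The cell above assumes the ManualLinearRegression module, the device, and the training tensors were defined earlier in the chapter. A minimal self-contained sketch that makes the loop runnable end to end is shown below; the parameter names b and w follow the cell's comments, but the module body and the synthetic data (true values 1 and 2 plus noise) are illustrative assumptions, not necessarily the book's exact code:

```python
import torch
import torch.nn as nn
import torch.optim as optim

device = 'cuda' if torch.cuda.is_available() else 'cpu'

class ManualLinearRegression(nn.Module):
    def __init__(self):
        super().__init__()
        # Registers "b" and "w" as trainable parameters,
        # randomly initialized
        self.b = nn.Parameter(torch.randn(1, dtype=torch.float))
        self.w = nn.Parameter(torch.randn(1, dtype=torch.float))

    def forward(self, x):
        # The linear model: yhat = b + w * x
        return self.b + self.w * x

# Illustrative synthetic data: y = 1 + 2x + noise
torch.manual_seed(42)
x_train_tensor = torch.rand(100, 1, device=device)
y_train_tensor = (1 + 2 * x_train_tensor
                  + 0.1 * torch.randn(100, 1, device=device))

model = ManualLinearRegression().to(device)
optimizer = optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss(reduction='mean')

for epoch in range(1000):
    model.train()
    yhat = model(x_train_tensor)          # Step 1 - forward pass
    loss = loss_fn(yhat, y_train_tensor)  # Step 2 - loss
    loss.backward()                       # Step 3 - gradients
    optimizer.step()                      # Step 4 - update
    optimizer.zero_grad()

print(model.state_dict())  # b should be close to 1, w close to 2
```

Because nn.Parameter registers the tensors with the module, model.parameters() hands them straight to the optimizer and model.state_dict() can report them by name, with no manual bookkeeping.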

108 | Chapter 1: A Simple Regression Problem
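About the "What is this?!?" comment: model.train() puts the module in training mode, and its counterpart model.eval() switches layers such as dropout and batch normalization to inference behavior. A minimal sketch of inference-time usage, with a stand-in nn.Linear model and an illustrative input (neither is from the book's cell):

```python
import torch
import torch.nn as nn

torch.manual_seed(42)
# Stand-in for a trained model; in the chapter this would be
# the fitted ManualLinearRegression instance
model = nn.Linear(1, 1)

model.eval()  # inference mode for layers like dropout/batchnorm
with torch.no_grad():  # predictions don't need gradient tracking
    x_new = torch.tensor([[0.5]])
    y_new = model(x_new)

print(y_new.shape)  # a single prediction of shape (1, 1)
```

For a plain linear regression train() and eval() behave identically, but calling them in the right places is a habit that pays off once the model gains mode-dependent layers.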
