Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub


piece of code that's going to be used repeatedly into its own function: the mini-batch inner loop!

The inner loop depends on three elements:

• the device where data is being sent

• a data loader to draw mini-batches from

• a step function, returning the corresponding loss

Taking these elements as inputs and using them to perform the inner loop, we'll end up with a function like this:

Helper Function #2

import numpy as np

def mini_batch(device, data_loader, step_fn):
    mini_batch_losses = []
    for x_batch, y_batch in data_loader:
        # send the mini-batch of data points to the device
        x_batch = x_batch.to(device)
        y_batch = y_batch.to(device)

        # perform one step and keep the corresponding loss
        mini_batch_loss = step_fn(x_batch, y_batch)
        mini_batch_losses.append(mini_batch_loss)

    # average the losses over all mini-batches in the epoch
    loss = np.mean(mini_batch_losses)
    return loss
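To see the function in action, here is a minimal sketch of calling it on a synthetic dataset. The function above is repeated so the snippet is self-contained, and dummy_step() is a hypothetical stand-in for illustration only; the real step function is the one built by the model configuration script.

```python
import numpy as np
import torch
from torch.utils.data import TensorDataset, DataLoader

def mini_batch(device, data_loader, step_fn):
    mini_batch_losses = []
    for x_batch, y_batch in data_loader:
        x_batch = x_batch.to(device)
        y_batch = y_batch.to(device)
        mini_batch_losses.append(step_fn(x_batch, y_batch))
    return np.mean(mini_batch_losses)

# synthetic data: 80 points in mini-batches of 16 -> 5 batches per epoch
x_train = torch.rand(80, 1)
y_train = 2 * x_train + 1
train_loader = DataLoader(TensorDataset(x_train, y_train), batch_size=16)

def dummy_step(x_batch, y_batch):
    # a stand-in "step": the MSE of a fixed (and wrong) guess for the line
    return ((y_batch - (1.5 * x_batch + 0.5)) ** 2).mean().item()

loss = mini_batch('cpu', train_loader, dummy_step)
```

Notice that mini_batch() knows nothing about models or optimizers; any callable that takes two tensors and returns a number will do, which is exactly why the step function was built as a higher-order function.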

In the last section, we realized that we were executing five times more updates (the train_step_fn() function) per epoch due to the mini-batch inner loop. Before, 1,000 epochs meant 1,000 updates. Now, we only need 200 epochs to perform the same 1,000 updates.
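The arithmetic behind "five times more updates" works out like this, assuming (as in the earlier data preparation) 80 training points split into mini-batches of 16:

```python
n_points, batch_size = 80, 16                 # assumed from data preparation
updates_per_epoch = n_points // batch_size    # 5 mini-batch updates per epoch

total_updates = 1000                          # same update budget as before
n_epochs = total_updates // updates_per_epoch # 200 epochs suffice
```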

What does our training loop look like now? It’s very lean!

Run - Data Preparation V1, Model Configuration V1

%run -i data_preparation/v1.py

%run -i model_configuration/v1.py
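Under these assumptions, the epoch loop itself shrinks to a couple of lines. The sketch below fakes the pieces the two scripts above would provide (device, train_loader, and train_step_fn are stand-ins built here so the snippet runs on its own):

```python
import numpy as np
import torch
import torch.nn as nn
from torch.utils.data import TensorDataset, DataLoader

def mini_batch(device, data_loader, step_fn):
    mini_batch_losses = []
    for x_batch, y_batch in data_loader:
        x_batch = x_batch.to(device)
        y_batch = y_batch.to(device)
        mini_batch_losses.append(step_fn(x_batch, y_batch))
    return np.mean(mini_batch_losses)

# stand-ins for what data_preparation/v1.py and model_configuration/v1.py build
device = 'cpu'
x = torch.rand(80, 1)
y = 2 * x + 1 + 0.1 * torch.randn(80, 1)
train_loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

model = nn.Linear(1, 1).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

def train_step_fn(x_batch, y_batch):
    model.train()
    loss = loss_fn(model(x_batch), y_batch)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

# the lean training loop: 200 epochs x 5 mini-batches = 1,000 updates
n_epochs = 200
losses = []
for epoch in range(n_epochs):
    losses.append(mini_batch(device, train_loader, train_step_fn))
```

All the mini-batch bookkeeping now lives inside mini_batch(), so the loop body is a single call per epoch.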

DataLoader | 143
