Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step: A Beginner's Guide (Leanpub)


# That's the typical mini-batch inner loop
for x_batch, y_batch in data_loader:
    x_batch = x_batch.to(self.device)
    y_batch = y_batch.to(self.device)

    # Step 1 - computes the model's predictions (forward pass)
    yhat = self.model(x_batch)
    # Step 2 - computes the loss
    loss = self.loss_fn(yhat, y_batch)
    # Step 3 - computes the gradients
    loss.backward()

    # Here we keep track of the losses (smoothed)
    # and the learning rates
    tracking['lr'].append(scheduler.get_last_lr()[0])
    if iteration == 0:
        tracking['loss'].append(loss.item())
    else:
        # Exponential moving average of the loss
        prev_loss = tracking['loss'][-1]
        smoothed_loss = (alpha * loss.item() +
                         (1 - alpha) * prev_loss)
        tracking['loss'].append(smoothed_loss)

    iteration += 1
    # Number of iterations reached
    if iteration == num_iter:
        break

    # Step 4 - updates parameters and learning rate, then zeroes gradients
    self.optimizer.step()
    scheduler.step()
    self.optimizer.zero_grad()

# Restores the original states of the model and the optimizer
self.optimizer.load_state_dict(previous_states['optimizer'])
self.model.load_state_dict(previous_states['model'])

# Plots the (smoothed) loss against the learning rate
if ax is None:
    fig, ax = plt.subplots(1, 1, figsize=(6, 4))
else:
    fig = ax.get_figure()
ax.plot(tracking['lr'], tracking['loss'])
if step_mode == 'exp':
    ax.set_xscale('log')
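The loop above relies on several objects that are created earlier in the method and therefore do not appear in this excerpt: scheduler, tracking, previous_states, alpha, num_iter, and step_mode. The smoothed loss is simply an exponential moving average, alpha * current loss + (1 - alpha) * previous smoothed loss, which damps mini-batch noise so the curve is easier to read. The sketch below shows one way this setup could look, assuming the learning rate is grown from its current value up to some end_lr using PyTorch's LambdaLR scheduler; the helper make_lr_fn and the names start_lr and end_lr are illustrative and not taken from the excerpt.

# A minimal setup sketch, assuming a LambdaLR-driven range test (see note above)
from copy import deepcopy
import numpy as np
from torch.optim.lr_scheduler import LambdaLR

def make_lr_fn(start_lr, end_lr, num_iter, step_mode='exp'):
    # Returns a multiplicative factor applied to the optimizer's base learning
    # rate at each iteration, growing it from start_lr to end_lr over num_iter steps
    if step_mode == 'linear':
        factor = (end_lr / start_lr - 1) / num_iter
        def lr_fn(iteration):
            return 1 + iteration * factor
    else:  # exponential growth
        factor = (np.log(end_lr) - np.log(start_lr)) / num_iter
        def lr_fn(iteration):
            return np.exp(factor) ** iteration
    return lr_fn

# Saves the current states so they can be restored after the test
previous_states = {'model': deepcopy(self.model.state_dict()),
                   'optimizer': deepcopy(self.optimizer.state_dict())}

start_lr = self.optimizer.state_dict()['param_groups'][0]['lr']
scheduler = LambdaLR(self.optimizer,
                     lr_lambda=make_lr_fn(start_lr, end_lr, num_iter, step_mode))

tracking = {'loss': [], 'lr': []}   # learning rates and (smoothed) losses
iteration = 0
alpha = 0.05                        # smoothing factor for the moving average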

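Assuming the excerpt belongs to a learning-rate range test method of the book's StepByStep training class (for example, something like lr_range_test(data_loader, end_lr, num_iter=100, step_mode='exp'); the method name and signature are not shown here and are assumptions), usage could look like this:

# Hypothetical usage; the method name and signature are assumptions
sbs = StepByStep(model, loss_fn, optimizer)
sbs.lr_range_test(train_loader, end_lr=1e-1, num_iter=100)
plt.show()

Because the model and optimizer states are restored at the end, the test does not disturb any training that follows; the plotted curve is only used to pick a sensible learning rate, typically a value somewhat below the point where the loss starts to rise sharply.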
