Daniel Voigt Godoy - Deep Learning with PyTorch Step-by-Step A Beginner’s Guide-leanpub

Schedulers in StepByStep — Part II

We need to make some more changes to handle mini-batch schedulers. Similar to "Part I" above, we need to create a protected method that handles the step() method of this group of schedulers.

StepByStep Method

def _mini_batch_schedulers(self, frac_epoch):
    if self.scheduler:
        if self.is_batch_lr_scheduler:
            if isinstance(self.scheduler,
                torch.optim.lr_scheduler.CosineAnnealingWarmRestarts):
                self.scheduler.step(self.total_epochs + frac_epoch)
            else:
                self.scheduler.step()

            current_lr = list(
                map(lambda d: d['lr'],
                    self.scheduler.optimizer.state_dict()\
                        ['param_groups'])
            )
            self.learning_rates.append(current_lr)

setattr(StepByStep, '_mini_batch_schedulers',
        _mini_batch_schedulers)
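To see what the method above is doing, here is a minimal standalone sketch (not from the book) of the same pattern: a CosineAnnealingWarmRestarts scheduler stepped once per mini-batch with a fractional epoch (the counterpart of self.total_epochs + frac_epoch), while the current learning rate is fetched from the optimizer's state dict after every step. The model, the T_0 value, and the loop sizes are arbitrary choices for illustration.

```python
import torch

# Dummy model and optimizer just to have parameters to schedule
model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Restart the cosine annealing cycle every two epochs (T_0=2 is arbitrary)
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(
    optimizer, T_0=2
)

learning_rates = []
n_epochs, n_batches = 4, 5
for epoch in range(n_epochs):
    for batch in range(n_batches):
        # ... forward pass, loss, backward, optimizer.step() would go here ...
        # Fraction of the current epoch already processed
        frac_epoch = (batch + 1) / n_batches
        # Step the scheduler with a fractional epoch, like the method above
        scheduler.step(epoch + frac_epoch)
        # Fetch the current learning rate(s) from the optimizer's state dict
        current_lr = list(map(lambda d: d['lr'],
                              optimizer.state_dict()['param_groups']))
        learning_rates.append(current_lr)
```

After the loop, learning_rates holds one entry per mini-batch (20 in total here), and the recorded values decay along the cosine curve within each two-epoch cycle before jumping back to the base rate at each restart.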
