
Schedulers in StepByStep — Part I

If we want to incorporate learning rate schedulers into our training loop, we need to make some changes to our StepByStep class. Since schedulers are definitely optional, we need to add a method to allow the user to set a scheduler (similar to what we did with the TensorBoard integration). Moreover, we need to define some attributes: one for the scheduler itself, and a boolean variable to distinguish whether it is an epoch or a mini-batch scheduler.

StepByStep Method

import torch.optim as optim  # required by the isinstance() checks below

setattr(StepByStep, 'scheduler', None)
setattr(StepByStep, 'is_batch_lr_scheduler', False)

def set_lr_scheduler(self, scheduler):
    # Makes sure the scheduler in the argument is assigned to the
    # optimizer we're using in this class
    if scheduler.optimizer == self.optimizer:
        self.scheduler = scheduler
        # These three schedulers are stepped once per mini-batch;
        # every other scheduler is stepped once per epoch
        if (isinstance(scheduler, optim.lr_scheduler.CyclicLR) or
                isinstance(scheduler, optim.lr_scheduler.OneCycleLR) or
                isinstance(scheduler,
                           optim.lr_scheduler.CosineAnnealingWarmRestarts)):
            self.is_batch_lr_scheduler = True
        else:
            self.is_batch_lr_scheduler = False

setattr(StepByStep, 'set_lr_scheduler', set_lr_scheduler)
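
To see the flag in action, here is a minimal usage sketch. It assumes a model, loss_fn, and optimizer have already been defined (as in the preceding sections), with the optimizer built on the model's parameters; the scheduler hyperparameters are illustrative only. A StepLR scheduler is stepped once per epoch, so the flag stays False, while a CyclicLR scheduler flips it to True.

from torch import optim

# Assumed to already exist: model, loss_fn, optimizer
sbs = StepByStep(model, loss_fn, optimizer)

# An epoch scheduler: halves the learning rate every two epochs
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=2, gamma=0.5)
sbs.set_lr_scheduler(scheduler)
print(sbs.is_batch_lr_scheduler)  # False

# A mini-batch scheduler: cycles the learning rate at every step
scheduler = optim.lr_scheduler.CyclicLR(
    optimizer, base_lr=1e-4, max_lr=1e-3, step_size_up=10
)
sbs.set_lr_scheduler(scheduler)
print(sbs.is_batch_lr_scheduler)  # True

Notice that set_lr_scheduler() silently ignores a scheduler that was built on a different optimizer, since the guard clause only assigns the scheduler when the two optimizers match.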

