Learning rate decay in PyTorch
# http://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html
def exp_lr_scheduler(optimizer, epoch, init_lr=0.001, lr_decay_epoch=7):
    """Decay learning rate by a factor of 0.1 every lr_decay_epoch epochs."""
    lr = init_lr * (0.1**(epoch // lr_decay_epoch))

    if epoch % lr_decay_epoch == 0:
        print('LR is set to {}'.format(lr))

    # Apply the decayed rate to every parameter group in the optimizer.
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

    return optimizer
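For context, a minimal usage sketch of calling this at the top of each epoch. Here model, dataloader, train_one_epoch, and num_epochs are hypothetical placeholders, not part of the gist:

import torch.optim as optim

optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

for epoch in range(num_epochs):
    # Re-derive the decayed lr from init_lr and the epoch index each pass.
    optimizer = exp_lr_scheduler(optimizer, epoch, init_lr=0.001, lr_decay_epoch=7)
    train_one_epoch(model, dataloader, optimizer)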
kartikpaigwar commented Jul 27, 2018

Actually, you should pass the current learning rate, not the initial lr. Forgive me if you are already passing the updated lr each time. Thank you.
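Note that because the function recomputes lr from init_lr and the epoch index (0.1**(epoch // lr_decay_epoch)), passing init_lr on every call does produce the intended decayed schedule. For comparison, a sketch of the equivalent built-in scheduler, torch.optim.lr_scheduler.StepLR (optimizer, num_epochs, and train_one_epoch are the same hypothetical placeholders as in the sketch above):

from torch.optim.lr_scheduler import StepLR

# StepLR multiplies the lr by gamma every step_size epochs, matching
# exp_lr_scheduler's factor-of-0.1-every-7-epochs schedule.
scheduler = StepLR(optimizer, step_size=7, gamma=0.1)

for epoch in range(num_epochs):
    train_one_epoch(model, dataloader, optimizer)
    scheduler.step()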
