@j-min
Created June 25, 2017 14:07
Learning rate decay in PyTorch
# http://pytorch.org/tutorials/beginner/transfer_learning_tutorial.html
def exp_lr_scheduler(optimizer, epoch, init_lr=0.001, lr_decay_epoch=7):
    """Decay learning rate by a factor of 0.1 every lr_decay_epoch epochs."""
    # Recompute the decayed rate from the initial rate and the current epoch.
    lr = init_lr * (0.1 ** (epoch // lr_decay_epoch))

    if epoch % lr_decay_epoch == 0:
        print('LR is set to {}'.format(lr))

    # Write the new rate into every parameter group of the optimizer.
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

    return optimizer
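
A minimal sketch of how this scheduler might be called once per epoch in a training loop. The model, loss, and synthetic data below are placeholders assumed for illustration; they are not part of the original gist.

import torch
import torch.nn as nn
import torch.optim as optim

# Placeholder model and synthetic batch, just to exercise the scheduler.
model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

inputs = torch.randn(32, 10)
labels = torch.randint(0, 2, (32,))

for epoch in range(25):
    # Re-derive the decayed rate from init_lr and the epoch index,
    # then apply it to the optimizer before this epoch's updates.
    optimizer = exp_lr_scheduler(optimizer, epoch, init_lr=0.001, lr_decay_epoch=7)

    optimizer.zero_grad()
    loss = criterion(model(inputs), labels)
    loss.backward()
    optimizer.step()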
@kartikpaigwar

Actually, you should pass the current learning rate, not the initial lr. Forgive me if you are already passing the updated lr each time. Thank you.
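
For comparison, torch.optim also ships a built-in scheduler that applies the same step decay without passing a learning rate around by hand. A minimal sketch, with a placeholder model and optimizer assumed for illustration:

import torch.nn as nn
import torch.optim as optim
from torch.optim import lr_scheduler

model = nn.Linear(10, 2)  # placeholder model
optimizer = optim.SGD(model.parameters(), lr=0.001)

# StepLR multiplies the lr of every param group by gamma each step_size epochs,
# i.e. the same 0.1 decay every 7 epochs as exp_lr_scheduler above.
scheduler = lr_scheduler.StepLR(optimizer, step_size=7, gamma=0.1)

for epoch in range(25):
    # ... one epoch of training with `optimizer` ...
    scheduler.step()  # advance the decay schedule once per epoch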
