@wassname
Created February 26, 2018 08:38
Combining PyTorch's Adam optimiser and a scheduled learning rate (StepLR) into a single class, for when the model doesn't have a callback for the scheduler.
import torch


class AdamStepLR(torch.optim.Adam):
    """Combine Adam and lr_scheduler.StepLR so we can use it as a normal optimiser."""

    def __init__(self, params, lr=0.001, betas=(0.9, 0.999), eps=1e-08, weight_decay=0,
                 step_size=50000, gamma=0.5):
        super().__init__(params, lr, betas, eps, weight_decay)
        # Attach a StepLR scheduler that decays the learning rate by `gamma` every `step_size` steps
        self.scheduler = torch.optim.lr_scheduler.StepLR(self, step_size, gamma)

    def step(self, closure=None):
        # Advance the scheduler, then take the usual Adam step
        self.scheduler.step()
        return super().step(closure)
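
A minimal usage sketch under assumed conditions: `model`, `inputs`, and `targets` below are hypothetical placeholders, not part of the original gist. It shows that AdamStepLR drops into a standard training loop like a plain optimiser, with the learning-rate decay handled internally.

# Hypothetical model and data, purely for illustration
model = torch.nn.Linear(10, 1)
optimiser = AdamStepLR(model.parameters(), lr=0.001, step_size=50000, gamma=0.5)

inputs, targets = torch.randn(32, 10), torch.randn(32, 1)
for _ in range(100):
    optimiser.zero_grad()
    loss = torch.nn.functional.mse_loss(model(inputs), targets)
    loss.backward()
    optimiser.step()  # updates the parameters and advances the StepLR scheduler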