
@jeremyjordan
Created March 2, 2018 01:24
Example implementation of LearningRateScheduler with a step decay schedule
import numpy as np
from keras.callbacks import LearningRateScheduler


def step_decay_schedule(initial_lr=1e-3, decay_factor=0.75, step_size=10):
    '''
    Wrapper function to create a LearningRateScheduler with a step decay schedule.
    '''
    def schedule(epoch):
        return initial_lr * (decay_factor ** np.floor(epoch / step_size))

    return LearningRateScheduler(schedule)


lr_sched = step_decay_schedule(initial_lr=1e-4, decay_factor=0.75, step_size=2)

model.fit(X_train, Y_train, callbacks=[lr_sched])
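
As a quick sanity check before training, the decayed learning rate can be tabulated directly from the same formula. The snippet below is a minimal sketch (the step_decay_values helper is hypothetical, not part of the gist) showing how the rate drops by 25% every step_size epochs with the parameters used above.

import numpy as np

# Hypothetical helper (assumption, not part of the gist): evaluates the same
# step decay formula for each epoch so the schedule can be inspected.
def step_decay_values(initial_lr=1e-4, decay_factor=0.75, step_size=2, epochs=10):
    return [initial_lr * (decay_factor ** np.floor(epoch / step_size))
            for epoch in range(epochs)]

# Print the learning rate per epoch; the value stays constant within each
# step_size window and is multiplied by decay_factor at each step boundary.
for epoch, lr in enumerate(step_decay_values()):
    print(f"epoch {epoch:2d}: lr = {lr:.2e}")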