@amohant4
Created June 6, 2019 00:29
global_step = tf.Variable(0, trainable=False)  # Variable that stores the number of iterations
starter_learning_rate = 0.1                    # Initial learning rate
learning_rate = tf.train.exponential_decay(
    starter_learning_rate, global_step,  # TF applies the decay formula shown above to this variable
    100000, 0.96, staircase=True)        # staircase=True forces an integer division, producing a step decay
# Passing global_step to minimize() will increment it at each step.
learning_step = (
    tf.train.GradientDescentOptimizer(learning_rate)   # the optimizer re-evaluates the decayed learning rate at each step
    .minimize(...my loss..., global_step=global_step)  # minimize() increments global_step (the iteration count)
)
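As a sanity check, the schedule `tf.train.exponential_decay` computes can be reproduced in plain Python. This is a minimal sketch of the documented formula, `lr = lr0 * decay_rate ** (global_step / decay_steps)`, with the exponent floored when `staircase=True`; the function name and signature here are illustrative, not part of TensorFlow:

```python
def exponential_decay(lr0, step, decay_steps, decay_rate, staircase=False):
    """Plain-Python mirror of the exponential-decay learning rate formula."""
    exponent = step / decay_steps
    if staircase:
        exponent = step // decay_steps  # integer division -> piecewise-constant (step) decay
    return lr0 * decay_rate ** exponent

# With the hyperparameters above (lr0=0.1, decay_steps=100000, decay_rate=0.96),
# the rate stays at 0.1 for the first 100000 steps, then drops to 0.1 * 0.96:
print(exponential_decay(0.1, 50000, 100000, 0.96, staircase=True))   # still 0.1
print(exponential_decay(0.1, 150000, 100000, 0.96, staircase=True))  # 0.1 * 0.96
```

Without `staircase=True` the exponent is fractional, so the learning rate decays smoothly at every step instead of in discrete jumps.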