@tomrunia
Created November 3, 2016 14:04
# We must calculate the mean of each gradient across the towers. Note that
# this is the synchronization point across all towers.
grads_and_vars = average_gradients(t_grads)

# Optionally clip the gradients by their global norm.
if config.max_norm_gradient > 0:
    grads, variables = zip(*grads_and_vars)
    grads_clipped, _ = tf.clip_by_global_norm(grads, clip_norm=config.max_norm_gradient)
    # Materialize as a list so it can be passed to apply_gradients (Python 3 zip is lazy).
    grads_and_vars = list(zip(grads_clipped, variables))

# Apply the gradients to adjust the shared variables.
apply_gradient_op = optimizer.apply_gradients(grads_and_vars, global_step=global_step)
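The snippet calls an `average_gradients` helper that is not shown here. As a reference, below is a minimal sketch of what such a helper typically looks like in the standard TensorFlow multi-GPU pattern (TF 1.x API); the exact signature and the structure of `t_grads` (a list of per-tower lists of `(gradient, variable)` tuples) are assumptions, not taken from this gist.

```python
import tensorflow as tf

def average_gradients(tower_grads):
    """Average gradients over all towers.

    tower_grads: list with one entry per tower, each a list of
        (gradient, variable) tuples as returned by optimizer.compute_gradients().
    Returns: a single list of (gradient, variable) tuples where each gradient
        is the mean of that variable's gradients across all towers.
    """
    average_grads = []
    # zip(*tower_grads) groups the (grad, var) tuples for the same variable
    # across towers, e.g. ((grad0_gpu0, var0), (grad0_gpu1, var0), ...).
    for grad_and_vars in zip(*tower_grads):
        # Stack the per-tower gradients along a new leading axis and average.
        grads = [tf.expand_dims(g, 0) for g, _ in grad_and_vars]
        grad = tf.reduce_mean(tf.concat(grads, axis=0), axis=0)
        # Variables are shared across towers, so the first tower's variable suffices.
        average_grads.append((grad, grad_and_vars[0][1]))
    return average_grads
```

Because the averaged gradients depend on every tower's loss, evaluating `apply_gradient_op` forces all towers to finish their backward pass first, which is why the averaging step acts as the synchronization point mentioned in the comment above.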