@wut0n9
Created January 9, 2019 11:24
# Gradient clipping
# https://github.com/tensorflow/models/blob/56cbd1f2770f1e7386db43af37f6f11b4d85e3da/tutorials/rnn/ptb/ptb_word_lm.py#L159-L165
# Clip the gradients so their global norm does not exceed config.max_grad_norm,
# then apply them with plain SGD (TensorFlow 1.x API).
tvars = tf.trainable_variables()
grads, _ = tf.clip_by_global_norm(tf.gradients(self._cost, tvars),
                                  config.max_grad_norm)
optimizer = tf.train.GradientDescentOptimizer(self._lr)
self._train_op = optimizer.apply_gradients(
    zip(grads, tvars),
    global_step=tf.train.get_or_create_global_step())
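For reference, `tf.clip_by_global_norm` rescales every gradient tensor by `clip_norm / max(global_norm, clip_norm)`, where `global_norm` is the L2 norm over all tensors concatenated. A minimal NumPy sketch of that computation (illustrative only, not the TensorFlow implementation):

```python
import numpy as np

def clip_by_global_norm(grads, clip_norm):
    # Global L2 norm across all gradient tensors taken together.
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    # Scale every tensor by the same factor; a no-op when already within bounds.
    scale = clip_norm / max(global_norm, clip_norm)
    return [g * scale for g in grads], global_norm

grads = [np.array([3.0, 4.0])]          # global norm = 5.0
clipped, norm = clip_by_global_norm(grads, clip_norm=1.0)
# clipped[0] → [0.6, 0.8], i.e. rescaled to unit global norm
```

Because all tensors share one scale factor, the clipped gradients keep their direction; only their overall magnitude is capped.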