
@unixpickle
Created October 20, 2017 20:26
Check the gradient through softmax cross-entropy: the gradient of the loss with respect to the logits should equal softmax(logits) minus the one-hot labels.
import numpy as np
import tensorflow as tf

sess = tf.Session()

# Logits and a one-hot label vector.
in_vec = tf.constant(np.array([1, 2, 3], dtype='float32'))
one_hot = tf.constant(np.array([0, 1, 0], dtype='float32'))

# Gradient of the cross-entropy loss with respect to the logits.
in_grad = tf.gradients(tf.nn.softmax_cross_entropy_with_logits(labels=one_hot, logits=in_vec), in_vec)[0]

# These two prints should produce the same vector:
# d(loss)/d(logits) = softmax(logits) - one_hot.
print(sess.run(in_grad))
print(sess.run(tf.nn.softmax(in_vec) - one_hot))
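The same identity can be checked without TensorFlow. The sketch below (an assumption-free restatement in plain NumPy, not part of the original gist) compares the analytic gradient softmax(x) − y against a central-difference numerical gradient of the cross-entropy loss:

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def cross_entropy(logits, labels):
    # -sum(labels * log softmax(logits))
    return -np.sum(labels * np.log(softmax(logits)))

logits = np.array([1.0, 2.0, 3.0])
labels = np.array([0.0, 1.0, 0.0])

# Analytic gradient: softmax(x) - y.
analytic = softmax(logits) - labels

# Central-difference numerical gradient.
eps = 1e-6
numeric = np.zeros_like(logits)
for i in range(len(logits)):
    up, down = logits.copy(), logits.copy()
    up[i] += eps
    down[i] -= eps
    numeric[i] = (cross_entropy(up, labels) - cross_entropy(down, labels)) / (2 * eps)

print(np.allclose(analytic, numeric, atol=1e-5))
```

If the identity holds, the two gradients agree to within finite-difference error and the script prints `True`.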