@randcode-generator
Created August 9, 2017 17:13
import tensorflow as tf

# Raw scores (logits) produced by a neural network
logits = [1.0, 2.0, 3.0, 4.0, 5.0, 1.0, 1.0]
# Expected (target) output
expected = [1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0]

output1 = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=expected)

# This is equivalent to softmax_cross_entropy_with_logits:
# apply softmax to the logits, then take the negative sum of expected * log(softmax)
softmax = tf.nn.softmax(logits)
output2 = -tf.reduce_sum(expected * tf.log(softmax))

with tf.Session() as sess:
    print(output1.eval())
    print(output2.eval())
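
For reference, the same equivalence can be checked without a TensorFlow session, e.g. in plain NumPy (a minimal sketch; the array values simply mirror the ones above, and the max-shift is only there for numerical stability):

import numpy as np

logits = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 1.0, 1.0])
expected = np.array([1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0])

# Softmax: exponentiate and normalize (shift by the max for numerical stability)
shifted = logits - logits.max()
softmax = np.exp(shifted) / np.exp(shifted).sum()

# Cross entropy: negative sum of expected * log(softmax)
cross_entropy = -np.sum(expected * np.log(softmax))
print(cross_entropy)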