@SuvroBaner
Created December 31, 2019 08:51
import numpy as np
import tensorflow as tf  # TensorFlow 1.x API (tf.placeholder / tf.Session)

def cost(logits, labels):
    """
    Computes the cost using the sigmoid cross entropy.

    Arguments:
    logits -- vector containing z, output of the last linear unit (before the final sigmoid activation)
    labels -- vector of labels y (1 or 0)

    Returns:
    cost -- vector of per-example sigmoid cross-entropy losses
    """
    # placeholders for the logits and labels -
    z = tf.placeholder(tf.float32, name = 'z')
    y = tf.placeholder(tf.float32, name = 'y')

    # the loss function: sigmoid cross entropy computed directly from the raw logits -
    cost = tf.nn.sigmoid_cross_entropy_with_logits(logits = z, labels = y)

    # running the session (the with-block closes the session automatically) -
    with tf.Session() as sess:
        cost = sess.run(cost, feed_dict = {z : logits, y : labels})

    return cost

logits = np.array([0.29, 0.34, 0.75, 0.88])
cost = cost(logits, np.array([0, 0, 1, 1]))
print(cost)
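As a sanity check, the same per-example loss can be computed in plain NumPy. This is a minimal sketch (the function name `sigmoid_cross_entropy` is my own) of the numerically stable form that TensorFlow documents for sigmoid cross entropy: max(z, 0) - z*y + log(1 + exp(-|z|)), which avoids overflow for large |z|.

```python
import numpy as np

def sigmoid_cross_entropy(z, y):
    # Numerically stable sigmoid cross entropy:
    # max(z, 0) - z * y + log(1 + exp(-|z|))
    # equivalent to -y*log(sigmoid(z)) - (1 - y)*log(1 - sigmoid(z))
    return np.maximum(z, 0) - z * y + np.log1p(np.exp(-np.abs(z)))

z = np.array([0.29, 0.34, 0.75, 0.88])
y = np.array([0, 0, 1, 1])
print(sigmoid_cross_entropy(z, y))
```

Running this reproduces the values returned by `tf.nn.sigmoid_cross_entropy_with_logits` for the same inputs, which confirms the loss is applied to raw logits, not to already-sigmoided probabilities.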