
@Ab1992ao
Last active June 2, 2021 22:10
define softmax_loss
import tensorflow as tf

def softmax_loss(vectors):
    """Softmax-style triplet loss over an (anchor, positive, negatives) tuple."""
    anc, pos, neg = vectors
    c = 0.5  # scaling factor; each dot product below ends up scaled by c**2
    anc = c * anc
    pos = c * pos
    neg = c * neg
    # Similarity of each anchor to its positive: shape [batch, 1]
    pos_sim = tf.reduce_sum(anc * pos, axis=-1, keepdims=True)
    # Similarities of each anchor to all negatives: shape [batch, num_neg]
    neg_mul = tf.matmul(anc, neg, transpose_b=True)
    # Log-sum-exp over the negatives (tf.log was renamed tf.math.log in TF 2.x)
    neg_sim = tf.math.log(tf.reduce_sum(tf.exp(neg_mul), axis=-1, keepdims=True))
    # Hinge: penalize only when the negatives collectively outscore the positive
    loss = tf.nn.relu(neg_sim - pos_sim)
    return loss
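As a sanity check on the math, here is a minimal NumPy sketch of the same loss (assuming `anc`/`pos` have shape `[batch, dim]` and `neg` has shape `[num_neg, dim]`; the function name and `c` default are my own):

import numpy as np

def softmax_loss_np(anc, pos, neg, c=0.5):
    # Scale all embeddings; dot products are effectively scaled by c**2
    anc, pos, neg = c * anc, c * pos, c * neg
    pos_sim = np.sum(anc * pos, axis=-1, keepdims=True)   # [batch, 1]
    neg_mul = anc @ neg.T                                 # [batch, num_neg]
    neg_sim = np.log(np.sum(np.exp(neg_mul), axis=-1, keepdims=True))
    return np.maximum(neg_sim - pos_sim, 0.0)             # hinge at zero

For example, if each positive equals its anchor while the negatives are all-zero vectors, `pos_sim` is positive and `neg_sim` is zero, so the hinge clips the loss to zero for every row.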