@dwf
Created Apr 10, 2017

Numerically stable binary cross-entropy, computed directly from logit predictions instead of relying on Theano's graph optimizer to rewrite the sigmoid output into a stable form.
from theano import tensor


def binary_crossentropy_from_logits(logits, targets):
    """Binary cross-entropy computed from model logits.

    Parameters
    ----------
    logits : TensorVariable
        The unnormalized log probabilities of a probabilistic binary
        classifier.
    targets : TensorVariable
        The targets for the classifier, in [0, 1].

    Returns
    -------
    crossentropy : TensorVariable
        The negative log probability of each prediction under the Dirac
        density specified by the corresponding target.
    """
    a, t = logits, targets
    # Uses -log sigmoid(a) = softplus(-a) and -log(1 - sigmoid(a)) = softplus(a)
    return t * tensor.nnet.softplus(-a) + (1 - t) * tensor.nnet.softplus(a)
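As a sanity check, the softplus identity behind this formulation can be verified in plain NumPy, without Theano. This is a hedged sketch, not part of the gist: the helper names (`bce_from_logits`, `bce_naive`) are made up for illustration. It shows the two forms agree for moderate logits, while the textbook sigmoid-then-log form loses all precision for large ones.

```python
import numpy as np

def softplus(x):
    # Numerically stable log(1 + exp(x)); logaddexp avoids overflow for large x.
    return np.logaddexp(0.0, x)

def bce_from_logits(logits, targets):
    # Same expression as the gist: t * softplus(-a) + (1 - t) * softplus(a)
    a, t = logits, targets
    return t * softplus(-a) + (1 - t) * softplus(a)

def bce_naive(logits, targets):
    # Sigmoid followed by the textbook cross-entropy; unstable for large |logits|.
    p = 1.0 / (1.0 + np.exp(-logits))
    with np.errstate(divide="ignore"):
        return -(targets * np.log(p) + (1 - targets) * np.log(1 - p))

a = np.array([-2.0, 0.5, 3.0])
t = np.array([0.0, 1.0, 1.0])
print(np.allclose(bce_from_logits(a, t), bce_naive(a, t)))  # True: both agree here

big = np.array([50.0])
# Naive form: sigmoid(50) rounds to exactly 1.0 in float64, so log(1 - p) is -inf.
print(bce_naive(big, np.array([0.0])))
# Logit form: softplus(50) is finite and approximately 50, the exact answer.
print(bce_from_logits(big, np.array([0.0])))
```

The logit form stays finite because softplus never takes a log of a quantity that has underflowed to zero, which is exactly why computing the loss from logits is preferable to post-hoc sigmoid outputs.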