@jeffdonahue
Last active March 24, 2017 04:24
import numpy as np
import theano
import theano.tensor as T

# pred is an unbounded logit; target is a probability in [0, 1].
pred, target = T.vectors('pt')  # two vectors, named 'p' and 't'

# Composed form: sigmoid followed by binary cross-entropy.
L1 = T.nnet.binary_crossentropy(T.nnet.sigmoid(pred), target).sum()
# Fused form: cross-entropy computed directly from the logits.
L2 = T.nnet.sigmoid_binary_crossentropy(pred, target).sum()

fl1, fl2 = [theano.function([pred, target], L) for L in (L1, L2)]
g1, g2 = [theano.grad(L, [pred, target]) for L in (L1, L2)]
fg1, fg2 = [theano.function([pred, target], g) for g in (g1, g2)]

np.random.seed(0)
n = 3
px = np.random.randn(n).astype(theano.config.floatX)
lx = np.random.random(n).astype(theano.config.floatX)

print('binary_crossentropy loss:', fl1(px, lx))
theano.printing.debugprint(fl1)
print('\nsigmoid_binary_crossentropy loss:', fl2(px, lx))
theano.printing.debugprint(fl2)
print('\nbinary_crossentropy grad:', fg1(px, lx))
theano.printing.debugprint(fg1)
print('\nsigmoid_binary_crossentropy grad:', fg2(px, lx))
theano.printing.debugprint(fg2)
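The fused graph that Theano compiles for `sigmoid_binary_crossentropy` (visible in the debugprint output below) evaluates `softplus(-|p|) + p * (1[p > 0] - t)`, which is algebraically equal to `binary_crossentropy(sigmoid(p), t)` but never computes `log` of a saturated sigmoid. A standalone NumPy sketch of that identity (function names here are mine, not part of the gist or of Theano):

```python
import numpy as np

def softplus(x):
    # numerically stable log(1 + exp(x))
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def naive_bce(p, t):
    # sigmoid first, then cross-entropy; log(1 - s) saturates for large p
    s = 1.0 / (1.0 + np.exp(-p))
    return -(t * np.log(s) + (1.0 - t) * np.log(1.0 - s))

def fused_bce(p, t):
    # the form Theano's fused op computes, stable for any logit magnitude
    return softplus(-np.abs(p)) + p * ((p > 0).astype(p.dtype) - t)

rng = np.random.RandomState(0)
p = rng.randn(5)
t = rng.random_sample(5)
print(np.allclose(naive_bce(p, t), fused_bce(p, t)))  # prints: True
```

For extreme logits (say `p = 500`) the naive form returns `inf` while the fused form stays finite, which is the practical reason to prefer the fused op.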
jeffdonahue commented Mar 23, 2017

Output:

binary_crossentropy loss: 2.69898871459
Elemwise{Neg}[(0, 0)] [id A] ''   2
 |Sum{acc_dtype=float64} [id B] ''   1
   |Elemwise{Composite{((i0 * i1 * scalar_softplus((-i2))) + (i3 * (i4 - i1) * scalar_softplus(i2)))}} [id C] ''   0
     |TensorConstant{(1,) of -1.0} [id D]
     |t [id E]
     |p [id F]
     |TensorConstant{(1,) of -1.0} [id D]
     |TensorConstant{(1,) of 1.0} [id G]

sigmoid_binary_crossentropy loss: 2.69898871459
Sum{acc_dtype=float64} [id A] ''   1
 |Elemwise{Composite{(scalar_softplus((-Abs(i0))) + (i0 * (GT(i0, i1) - i2)))}} [id B] ''   0
   |p [id C]
   |TensorConstant{(1,) of 0} [id D]
   |t [id E]

binary_crossentropy grad: [array([ 0.43006166, -0.04716868,  0.28927052]), array([-1.76405235, -0.40015721, -0.97873798])]
Elemwise{Composite{((i0 * i1 * (i2 - scalar_sigmoid(i3))) - (i0 * (i2 - i1) * scalar_sigmoid(i3)))}} [id A] ''   0
 |TensorConstant{(1,) of -1.0} [id B]
 |t [id C]
 |TensorConstant{(1,) of 1.0} [id D]
 |p [id E]
Elemwise{Composite{(scalar_softplus((-i0)) + (-scalar_softplus(i0)))}} [id F] ''   1
 |p [id E]

sigmoid_binary_crossentropy grad: [array([ 0.43006166, -0.04716868,  0.28927052]), array([-1.76405235, -0.40015721, -0.97873798])]
Elemwise{Composite{(scalar_sigmoid(i0) - i1)}} [id A] ''   0
 |p [id B]
 |t [id C]
Elemwise{neg,no_inplace} [id D] ''   1
 |p [id B]
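
The gradient graph above simplifies to `sigmoid(p) - t` per element (and `-p` for the gradient with respect to the target). A quick NumPy finite-difference check of the logit gradient, using the same stable softplus form of the loss (helper names are mine, for illustration only):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def loss(p, t):
    # elementwise BCE on logits, softplus(p) - p * t, summed
    return np.sum(np.log1p(np.exp(-np.abs(p))) + np.maximum(p, 0.0) - p * t)

rng = np.random.RandomState(0)
p = rng.randn(4)
t = rng.random_sample(4)

analytic = sigmoid(p) - t  # the simplified gradient in the graph above
eps = 1e-6
numeric = np.array([(loss(p + eps * e, t) - loss(p - eps * e, t)) / (2 * eps)
                    for e in np.eye(4)])
print(np.allclose(analytic, numeric, atol=1e-4))  # prints: True
```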
