@joschu · Last active December 28, 2015
import numpy as np, theano.tensor as TT, theano
import numdifftools as ndt

# Build a symbolic expression z = mod(x**2, y) and compile it,
# along with its gradients with respect to x and y.
x = TT.scalar('x')
y = TT.scalar('y')
z = TT.mod(x**2, y)
# z = x**2 + y**2  # alternative test expression
f = theano.function([x, y], z, allow_input_downcast=True)
dfdx = theano.function([x, y], TT.grad(z, x), allow_input_downcast=True)
dfdy = theano.function([x, y], TT.grad(z, y), allow_input_downcast=True)

# Compare Theano's symbolic gradients against numerical derivatives
# from numdifftools at random test points.
for i in xrange(100):
    print i
    xtest = np.random.randn()
    ytest = np.random.rand()
    assert np.allclose(ndt.Derivative(lambda _x: f(_x, ytest), step_nom=1e-8)(xtest),
                       dfdx(xtest, ytest), atol=1e-8)
    assert np.allclose(ndt.Derivative(lambda _y: f(xtest, _y), step_nom=1e-8)(ytest),
                       dfdy(xtest, ytest), atol=1e-8)
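
The same gradient check can be sketched without the numdifftools dependency by using a plain central difference. The helper name numeric_deriv and the step size h below are illustrative choices, not part of the original gist; note that for z = mod(x**2, y) either check can spuriously fail when the finite-difference step straddles a discontinuity of mod.

# A minimal sketch, assuming f, dfdx, dfdy compiled as above.
def numeric_deriv(g, t, h=1e-6):
    # Central difference: (g(t + h) - g(t - h)) / (2h)
    return (g(t + h) - g(t - h)) / (2 * h)

xtest, ytest = np.random.randn(), np.random.rand()
assert np.allclose(numeric_deriv(lambda _x: f(_x, ytest), xtest),
                   dfdx(xtest, ytest), atol=1e-6)
assert np.allclose(numeric_deriv(lambda _y: f(xtest, _y), ytest),
                   dfdy(xtest, ytest), atol=1e-6)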