Gist by @ahmadyan, created November 12, 2018 04:18

Auto-differentiation of the Rosenbrock function using AutoGrad
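
The snippet below assumes a rosenbrock function is already in scope; the original gist does not show its definition or imports. A minimal sketch, assuming the standard two-dimensional form f(x, y) = (1 - x)^2 + 100 (y - x^2)^2, which has its global minimum of 0 at (1, 1). Note that autograd's wrapped NumPy must be used in place of plain NumPy so the computation can be traced:

import autograd
import autograd.numpy as np  # autograd's NumPy wrapper; plain numpy is not traceable
import scipy.optimize

# Assumed definition: standard 2-D Rosenbrock function, minimum of 0 at (1, 1).
def rosenbrock(x):
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
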
# Build a function that returns both the value and the gradient using autograd.
rosenbrock_with_grad = autograd.value_and_grad(rosenbrock)

# Optimize with scipy's conjugate-gradient method; jac=True tells minimize
# that the objective returns a (value, gradient) pair.
result = scipy.optimize.minimize(rosenbrock_with_grad, x0=np.array([0.0, 0.0]),
                                 jac=True, method='CG')
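
As a sanity check, the optimizer should recover the known minimizer of the Rosenbrock function; the expected values below follow from that fact, not from output shown in the gist:

print(result.x)    # expected to be close to [1., 1.]
print(result.fun)  # expected to be close to 0.0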