@lanius
Created January 25, 2016 14:04
Multivariable Newton's method with autograd
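The newton routine below searches for a stationary point of a scalar function f: R^n -> R by iterating the multivariable Newton update: at each step it solves H(x) * delta = -grad f(x) for the step delta and sets x <- x + delta, stopping once the step is smaller than tol. autograd builds the gradient and Hessian directly from the plain Python definition of f, so only f itself has to be written by hand (the package is available on PyPI as autograd).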
import autograd.numpy as np
import autograd


def newton(f, x0, tol=1.48e-08, maxiter=50):
    # Gradient and Hessian of f, obtained by automatic differentiation.
    g = autograd.grad(f)
    h = autograd.hessian(f)
    x = x0
    for _ in range(maxiter):
        # Newton step: solve H(x) * delta = -grad f(x) rather than inverting H.
        delta = np.linalg.solve(h(x), -g(x))
        x = x + delta
        if np.linalg.norm(delta) < tol:
            break
    return x


def main():
    def f(x):
        return (x[0] ** 3) + (x[1] ** 3) - (9 * x[0] * x[1]) + 27

    x0 = np.array([3., 2.])
    print(newton(f, x0))


if __name__ == '__main__':
    main()
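
From the starting point [3., 2.], the iteration should converge to the stationary point near [3., 3.], where the gradient (3x^2 - 9y, 3y^2 - 9x) vanishes. As a further illustration (not part of the original gist), the same routine can be applied to any twice-differentiable scalar function; the Rosenbrock function and starting point below are just example choices:

def rosenbrock(x):
    # Classic test function with its minimum at [1., 1.].
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

print(newton(rosenbrock, np.array([0., 0.])))  # should approach [1., 1.]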