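# This script assumes a reverse-mode autodiff `Variable` class exposing
# .value, .gradient, .backward(), and .clear_gradient() (the interface used
# below). What follows is only a minimal sketch of such a class so the script
# is self-contained; the operator set and internals here are assumptions, not
# necessarily the original implementation.

class Variable:
    def __init__(self, value, parents=None):
        self.value = value
        self.gradient = 0.0
        # (parent Variable, local derivative d(self)/d(parent)) pairs
        self.parents = parents if parents is not None else []

    def backward(self, grad=1.0):
        # accumulate the incoming gradient, then apply the chain rule
        self.gradient += grad
        for parent, local_grad in self.parents:
            parent.backward(grad * local_grad)

    def clear_gradient(self):
        self.gradient = 0.0

    @staticmethod
    def _wrap(other):
        return other if isinstance(other, Variable) else Variable(float(other))

    def __add__(self, other):
        other = Variable._wrap(other)
        return Variable(self.value + other.value, [(self, 1.0), (other, 1.0)])

    __radd__ = __add__

    def __sub__(self, other):
        other = Variable._wrap(other)
        return Variable(self.value - other.value, [(self, 1.0), (other, -1.0)])

    def __rsub__(self, other):
        other = Variable._wrap(other)
        return Variable(other.value - self.value, [(other, 1.0), (self, -1.0)])

    def __mul__(self, other):
        other = Variable._wrap(other)
        return Variable(self.value * other.value,
                        [(self, other.value), (other, self.value)])

    __rmul__ = __mul__

    def __pow__(self, exponent):
        # plain int/float exponents only, which is all fn(x) needs
        return Variable(self.value ** exponent,
                        [(self, exponent * self.value ** (exponent - 1))])

    def __repr__(self):
        return '< Variable value: {}, gradient: {} >'.format(self.value, self.gradient)
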
# Function whose output we want to drive toward the target value.
def fn(x):
    return x**2 + 0.2*(x-2)**5 + 2*x**3
# initialization
x = Variable(3.)
target = 42.
print('---- Initial Value ----')
print('fn(x): {}'.format(fn(x)))
print('Target: {}'.format(target))
print('initial guess for x: {}'.format(x))
for n in range(20):
    # squared error between fn(x) and the target
    L = (fn(x) - target)**2
    # backpropagate to populate x.gradient
    L.backward()
    # gradient descent update
    x.value = x.value - 1e-4 * x.gradient
    # clear the gradients
    x.clear_gradient()
print('---- Converged Value ----')
print('fn(x): {}'.format(fn(x)))
print('Target: {}'.format(target))
print('converged x: {}'.format(x))
'''
Wolfram Alpha minimum:
min{(x^2 + 0.2 (x - 2)^5 + 2 x^3 - 42)^2} = 0 at x ≈ 2.60158

Output:
---- Initial Value ----
fn(x): < Variable value: 63.2, gradient: 0.0 >
Target: 42.0
initial guess for x: < Variable value: 3.0, gradient: 0.0 >
---- Converged Value ----
fn(x): < Variable value: 42.00014691908245, gradient: 0.0 >
Target: 42.0
converged x: < Variable value: 2.601581019114941, gradient: 0.0 >
'''
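
# Quick sanity check of the Wolfram Alpha note above (a sketch, not part of
# the original run): fn uses only ordinary arithmetic, so it also accepts a
# plain float. x_star is the approximate minimizer quoted above.
x_star = 2.60158
print('fn(x_star): {}'.format(fn(x_star)))  # expect a value very close to 42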