Gradient Descent Example Python
# Given f(x) = x**4 - 3*x**3 + 2, the derivative is f'(x) = 4*x**3 - 9*x**2
# Let's start at x = 6
curr_x = 6
gamma = 0.001
precision = 0.0000001
step_size = 1
max_iterations = 1000
i = 0
df = lambda x: (4 * x**3) - (9 * x**2)

while step_size > precision and i < max_iterations:
    prev_x = curr_x
    curr_x -= gamma * df(prev_x)
    step_size = abs(curr_x - prev_x)
    i += 1

print("The local minimum occurs at:", curr_x)
current        // Initial guess at where the optimum lies
gamma          // A step size determined by the user
precision      // How precise the answer must be
step_size = 1  // The difference between the current and previous steps
max_iterations // How long we are willing to let the algorithm run
i = 0          // Iteration counter
df             // Function that returns our derivative / partial derivatives

while step_size > precision and i < max_iterations:
    previous = current                        // retain the old position for comparison
    current = previous - gamma * df(previous) // step in the negative gradient direction
    step_size = abs(current - previous)       // determine how far we have moved
    i += 1
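To reuse the pseudocode beyond this one example, it could be packaged as a function. This is a minimal sketch of my own (the name gradient_descent, the start parameter, and the returned tuple are assumptions, not part of the original gist):

def gradient_descent(df, start, gamma=0.001, precision=1e-7, max_iterations=1000):
    """Minimise a differentiable function given its derivative df.

    Returns the estimated minimiser and the number of iterations used.
    """
    current = start
    step_size = precision + 1  # ensure the loop runs at least once
    i = 0
    while step_size > precision and i < max_iterations:
        previous = current
        current = previous - gamma * df(previous)  # move against the gradient
        step_size = abs(current - previous)
        i += 1
    return current, i

# Reproduces the example above: minimum of x**4 - 3*x**3 + 2 near x = 2.25
x_min, iterations = gradient_descent(lambda x: 4 * x**3 - 9 * x**2, start=6)
print("Local minimum at x =", x_min, "after", iterations, "iterations")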