simple regression using batch gradient descent
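The model is simple linear regression: each y_i is predicted as beta * x_i + alpha, with per-point residual e_i = y_i - (beta * x_i + alpha). Batch gradient descent minimizes the sum of squared errors over the whole dataset; its partial derivatives are sum(-2 * e_i) with respect to alpha and sum(-2 * e_i * x_i) with respect to beta, which is exactly what gradient_fn accumulates below.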
import random

def target_fn(theta):
    """want to minimize squared errors as a function of *theta*, so we hardcode in the data"""
    alpha, beta = theta
    return sum_of_squared_errors(alpha, beta, num_friends_good, daily_minutes_good)

def gradient_fn(theta):
    """similarly, need the gradient as a function of *theta*, so we hardcode in the data"""
    alpha, beta = theta
    result = [0, 0]
    for x_i, y_i in zip(num_friends_good, daily_minutes_good):
        e = error(alpha, beta, x_i, y_i)
        result[0] += -2 * e        # partial derivative with respect to alpha
        result[1] += -2 * e * x_i  # partial derivative with respect to beta
    return result

theta = [random.random(), random.random()]  # random starting point
alpha, beta = minimize_batch(target_fn, gradient_fn, theta, 1.0)
print("alpha", alpha)
print("beta", beta)