@dradecic
Created October 13, 2019 07:58
linreg_gradient_descent
# x and y are assumed to be equal-length sequences of numbers (not defined in the gist)
b0, b1 = 0.0, 1.0   # initial intercept and slope
lr = 0.001          # learning rate
epochs = 10000
error = []          # per-epoch sum of squared errors, for visualization

# run 10000 times
for epoch in range(epochs):
    # reset per-epoch cost and the gradient accumulators Jb_0, Jb_1
    epoch_cost, cost_b0, cost_b1 = 0.0, 0.0, 0.0
    for i in range(len(x)):
        # make a prediction and accumulate its squared error
        y_pred = b0 + b1 * x[i]
        epoch_cost += (y[i] - y_pred) ** 2
    for j in range(len(x)):
        # partial derivatives of the squared-error cost w.r.t. b0 and b1 for the current row
        partial_wrt_b0 = -2 * (y[j] - (b0 + b1 * x[j]))
        partial_wrt_b1 = (-2 * x[j]) * (y[j] - (b0 + b1 * x[j]))
        # accumulate the gradients over all rows
        cost_b0 += partial_wrt_b0
        cost_b1 += partial_wrt_b1
    # take one gradient-descent step per epoch
    b0 = b0 - lr * cost_b0
    b1 = b1 - lr * cost_b1
    # keep track of errors - for visualization purposes
    error.append(epoch_cost)
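For comparison, the same sum-of-squared-errors updates can be vectorized with NumPy. This is a sketch, not part of the original gist: the sample `x` and `y` arrays below are made-up data (roughly y = 2 + 3x plus noise), and the gradient expressions are the vector form of the per-row partial derivatives above.

```python
import numpy as np

# Hypothetical sample data (not from the gist): roughly y = 2 + 3x plus noise
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 5.0, 7.9, 11.2, 13.8])

b0, b1 = 0.0, 1.0   # initial intercept and slope
lr = 0.001          # learning rate
error = []          # per-epoch sum of squared errors

for _ in range(10000):
    residual = y - (b0 + b1 * x)
    error.append(np.sum(residual ** 2))
    # gradients of the summed squared error w.r.t. b0 and b1
    grad_b0 = -2.0 * np.sum(residual)
    grad_b1 = -2.0 * np.sum(x * residual)
    b0 -= lr * grad_b0
    b1 -= lr * grad_b1
```

With these settings the coefficients settle close to the ordinary-least-squares fit for the sample data, and `error` decreases monotonically, which is a quick sanity check on the loop version.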