@adarsh1021
Created September 16, 2018 13:38
import numpy as np

# Building the model
# X and Y are assumed to be 1-D NumPy arrays of the feature and target values
m = 0 # Initial slope
c = 0 # Initial intercept
L = 0.0001 # The learning rate
epochs = 1000 # The number of iterations to perform gradient descent
n = float(len(X)) # Number of elements in X

# Performing Gradient Descent
for i in range(epochs):
    Y_pred = m*X + c # The current predicted value of Y
    D_m = (-2/n) * np.sum(X * (Y - Y_pred)) # Derivative of the MSE loss wrt m
    D_c = (-2/n) * np.sum(Y - Y_pred) # Derivative of the MSE loss wrt c
    m = m - L * D_m # Update m
    c = c - L * D_c # Update c

print(m, c)
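For reference, D_m and D_c in the loop above are the partial derivatives of the mean squared error loss with respect to the slope m and the intercept c:

E = \frac{1}{n}\sum_{i=1}^{n}\left(Y_i - (mX_i + c)\right)^2

\frac{\partial E}{\partial m} = -\frac{2}{n}\sum_{i=1}^{n} X_i\left(Y_i - \hat{Y}_i\right), \qquad \frac{\partial E}{\partial c} = -\frac{2}{n}\sum_{i=1}^{n}\left(Y_i - \hat{Y}_i\right)

where \hat{Y}_i = mX_i + c is the current prediction.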
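The gist does not define X and Y; a minimal sketch for trying it end to end, assuming synthetic data scattered around an arbitrary line y = 3x + 4 (the line, seed, and sample size are illustrative assumptions, not part of the original gist):

import numpy as np

# Hypothetical synthetic data (assumed, not from the gist): noisy samples around y = 3x + 4
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=100)
Y = 3 * X + 4 + rng.normal(0, 1, size=100)

With this X and Y in scope, the gradient-descent loop above runs as-is; m and c should drift towards the generating slope and intercept (the intercept converges slowly at such a small learning rate).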