@sameerg07
Created June 7, 2021 09:57
Manual gradient descent for given X and y, with initial m and c, fitting the line y = mX + c
import pandas as pd

# Takes in X, y, current m and c (both initialised to 0), number of iterations and learning rate;
# returns a DataFrame holding m, c and the cost after every iteration
def gradient(X, y, m_current=0, c_current=0, iters=1000, learning_rate=0.01):
    N = float(len(y))
    gd_df = pd.DataFrame(columns=['m_current', 'c_current', 'cost'])
    for i in range(iters):
        y_pred = (m_current * X) + c_current
        # mean squared error at the current m and c
        cost = sum((y - y_pred) ** 2) / N
        # partial derivatives of the cost with respect to m and c
        m_gradient = -(2 / N) * sum(X * (y - y_pred))
        c_gradient = -(2 / N) * sum(y - y_pred)
        # step against the gradient
        m_current = m_current - (learning_rate * m_gradient)
        c_current = c_current - (learning_rate * c_gradient)
        gd_df.loc[i] = [m_current, c_current, cost]
    return gd_df
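To sanity-check convergence, the function can be exercised on synthetic data drawn from a known line. A minimal sketch (the sample data, iteration count and learning rate below are illustrative, not part of the gist):

```python
import numpy as np
import pandas as pd

def gradient(X, y, m_current=0, c_current=0, iters=1000, learning_rate=0.01):
    N = float(len(y))
    gd_df = pd.DataFrame(columns=['m_current', 'c_current', 'cost'])
    for i in range(iters):
        y_pred = (m_current * X) + c_current
        cost = sum((y - y_pred) ** 2) / N
        m_gradient = -(2 / N) * sum(X * (y - y_pred))
        c_gradient = -(2 / N) * sum(y - y_pred)
        m_current = m_current - (learning_rate * m_gradient)
        c_current = c_current - (learning_rate * c_gradient)
        gd_df.loc[i] = [m_current, c_current, cost]
    return gd_df

# Noise-free points on the line y = 3x + 2, so the true (m, c) is (3, 2)
X = np.linspace(0, 5, 50)
y = 3 * X + 2

history = gradient(X, y, iters=2000, learning_rate=0.01)
print(history.tail(1))  # m_current and c_current should approach 3 and 2
```

Because the gradients are exact (full-batch), the cost column should decrease monotonically; if it diverges instead, the learning rate is too large for the scale of X.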