
@kssd
Last active July 3, 2019 19:34
Python n-dim Residual sum of squares.

Least-squares fit - computing r_squared (the coefficient of determination) from the residual sum of squares:

import numpy as np
from scipy.optimize import curve_fit

x = np.arange(1, 11, 1)
y = x + 10

def lin_mod(x, a, b): return x*a + b

# curve_fit returns the optimal parameters and their covariance matrix
(a, b), cov = curve_fit(lin_mod, x, y)

y_mean = np.mean(y)
s_tot = np.sum((y - y_mean)**2)            # total sum of squares
s_res = np.sum((y - lin_mod(x, a, b))**2)  # residual sum of squares

r_squared = 1 - (s_res / s_tot)
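A quick sketch of the same computation on noisy data (the noise level and seed here are arbitrary choices for illustration): with exactly linear data, s_res is ~0 and r_squared is ~1, while added noise pushes r_squared below 1.

```python
import numpy as np
from scipy.optimize import curve_fit

def lin_mod(x, a, b):
    return x*a + b

rng = np.random.default_rng(0)
x = np.arange(1, 11, 1)
y = x + 10 + rng.normal(0, 0.5, size=x.size)  # linear trend plus noise

(a, b), cov = curve_fit(lin_mod, x, y)

s_tot = np.sum((y - np.mean(y))**2)            # total sum of squares
s_res = np.sum((y - lin_mod(x, a, b))**2)      # residual sum of squares
r_squared = 1 - s_res / s_tot                   # coefficient of determination

print(a, b, r_squared)
```

With a noise standard deviation small relative to the spread of y, the fitted slope and intercept stay close to 1 and 10 and r_squared stays close to, but strictly below, 1.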