Stochastic Gradient Descent

Fits a single-feature linear model (y ≈ m*x + b) by updating the slope m and intercept b with the mean-squared-error gradient of a small random mini-batch on each epoch.
import numpy as np
from sklearn.metrics import mean_squared_error

def SGD(X, y, lr=0.05, epoch=10, batch_size=1):
    '''
    Stochastic Gradient Descent for a single feature
    '''
    m, b = 0.5, 0.5    # initial parameters
    log, mse = [], []  # lists to store learning process
    for _ in range(epoch):
        indexes = np.random.randint(0, len(X), batch_size)  # random sample
        Xs = np.take(X, indexes)
        ys = np.take(y, indexes)
        N = len(Xs)
        f = ys - (m*Xs + b)  # residuals on the mini-batch
        # Update parameters m and b along the negative MSE gradient
        m -= lr * (-2 * Xs.dot(f).sum() / N)
        b -= lr * (-2 * f.sum() / N)
        log.append((m, b))                            # track parameters per epoch
        mse.append(mean_squared_error(y, m*X + b))    # track error on the full data
    return m, b, log, mse
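
A minimal usage sketch (not part of the original gist): fit noisy synthetic data and check that the recovered slope and intercept approach the true values, roughly 3 and 2. The data, learning rate, epoch count, and batch size below are illustrative assumptions.

import numpy as np

np.random.seed(42)
X = np.random.rand(200)
y = 3 * X + 2 + np.random.normal(0, 0.1, 200)  # noisy line y = 3x + 2

m, b, log, mse = SGD(X, y, lr=0.05, epoch=1000, batch_size=10)
print(f'm = {m:.2f}, b = {b:.2f}, final MSE = {mse[-1]:.4f}')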