# Batch Gradient Descent
import numpy as np

eta = 0.1  # learning rate
n_iterations = 1000
m = 100  # number of training instances

X = 2 * np.random.rand(m, 1)
y = 4 + 3 * X + np.random.randn(m, 1)  # y = 4 + 3x + Gaussian noise
X_b = np.c_[np.ones((m, 1)), X]  # add x0 = 1 to each instance

theta = np.random.randn(2, 1)  # random initialization

for iteration in range(n_iterations):
    gradients = 2 / m * X_b.T.dot(X_b.dot(theta) - y)  # gradient of the MSE cost
    theta = theta - eta * gradients  # step downhill along the gradient

print(theta)
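
# Sanity check (a minimal sketch using the variables defined above): the
# closed-form normal-equation solution should agree with the gradient-descent
# estimate, roughly [4, 3] given how y was generated.
theta_best = np.linalg.inv(X_b.T.dot(X_b)).dot(X_b.T).dot(y)
print(theta_best)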