@krishnanraman
Created November 28, 2020 01:48
gradient descent for regression
# Goal: find the unknown scalar w that minimizes L(w)
# L(w) = sum((y[i] - w*x[i])^2)
# over the dataset (x[i], y[i]), i = 1..n, for linear regression through the origin.
#
# Gradient: dL/dw = -2*sum(x[i]*(y[i] - w*x[i])) = 2*(-sum(x*y) + w*sum(x*x))
# (The constant factor 2 is absorbed into the step size alpha below.)
#
# Repeat the iterative update below until convergence:
# w[i+1] = w[i] - alpha * gradient(L, w = w[i])
#
set.seed(12345)
x = seq(-5, 5, 0.5)
y = 2*x + rnorm(length(x), 0, 1)  # true slope is 2, plus Gaussian noise
# Half-gradient of L(w) is cc + dd*w (cc/dd avoid shadowing R's built-in c()):
cc = -sum(x*y)
dd = sum(x*x)
w0 = 123        # arbitrary starting weight
alpha = 0.0001  # learning rate
eps = 1e-04     # convergence threshold on the change in w
grad = function(w) { cc + dd*w }
l = function(w) { sum((y - w*x)^2) }  # squared-error loss L(w)
wiplus1 = function(wi, alpha) { wi - alpha*grad(wi) }
wi = w0
wprev = w0
i = 0
err = l(wi)
cat(sprintf("Start Gradient Descent! Steps: %d\tWeight: %f\tError: %f\n", i, wi, err))
while (TRUE) {
  i = i + 1
  wprev = wi
  wi = wiplus1(wi, alpha)
  if (abs(wprev - wi) < eps) break
  err = l(wi)
  # cat(sprintf("In Gradient Descent! Steps: %d\tWeight: %f\tError: %f\n", i, wi, err))
}
cat(sprintf("Finished Gradient Descent! Steps: %d\tWeight: %f\tError: %f\n", i, wi, err))
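As a sanity check (not part of the original gist): for regression through the origin, setting dL/dw = 0 gives the closed-form slope w* = sum(x*y)/sum(x*x), which gradient descent should converge to. A minimal sketch, reusing the same seed and data-generation step as above:

```r
# Closed-form least-squares slope for y ~ w*x (no intercept):
# dL/dw = 0  =>  w_star = sum(x*y) / sum(x*x)
set.seed(12345)
x = seq(-5, 5, 0.5)
y = 2*x + rnorm(length(x), 0, 1)
w_star = sum(x*y) / sum(x*x)
cat(sprintf("Closed-form weight: %f\n", w_star))

# Cross-check against R's built-in least-squares fit,
# with the intercept suppressed via "+ 0" in the formula:
fit = lm(y ~ x + 0)
cat(sprintf("lm() coefficient:   %f\n", coef(fit)[["x"]]))
```

Both values should agree to machine precision and sit close to the true slope of 2, so the gradient-descent result above can be compared directly against them.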