@achrafsoltani
Created April 18, 2022 20:28
function parameters = gradient_descent(X, Y, theta0, theta1, learning_rate, initial_cost)
  % Batch gradient descent for the linear hypothesis H = theta0 + theta1*X.
  % Relies on the helper functions compute_cost and compute_derivatives.
  m = length(X);
  cost = initial_cost;
  previous_cost = initial_cost + 1;  % seeded above cost so a convergence test could fire
  i = 0;
  while (i < 100)
    previous_cost = cost;                  % remember last iteration's cost
    H = theta0 + theta1.*X;                % predictions for the current thetas
    cost = compute_cost(H, Y, m);
    derivatives = compute_derivatives(X, Y, theta0, theta1, m);
    theta0 = theta0 - learning_rate*derivatives(1);  % simultaneous update
    theta1 = theta1 - learning_rate*derivatives(2);
    i = i + 1;
  end
  parameters = [theta0, theta1, cost];
end
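
For reference, here is a hypothetical NumPy re-implementation of the same loop. The helpers `compute_cost` and `compute_derivatives` are not included in the gist, so the mean-squared-error cost and its partial derivatives written out below are assumptions about what they compute; the parameter names and the 100-iteration default follow the Octave code.

```python
import numpy as np

def compute_cost(H, Y, m):
    # Mean squared error cost: J = (1 / (2m)) * sum((H - Y)^2)
    return np.sum((H - Y) ** 2) / (2 * m)

def compute_derivatives(X, Y, theta0, theta1, m):
    # Partial derivatives of J with respect to theta0 and theta1
    H = theta0 + theta1 * X
    d0 = np.sum(H - Y) / m
    d1 = np.sum((H - Y) * X) / m
    return d0, d1

def gradient_descent(X, Y, theta0, theta1, learning_rate, iterations=100):
    m = len(X)
    cost = compute_cost(theta0 + theta1 * X, Y, m)
    for _ in range(iterations):
        d0, d1 = compute_derivatives(X, Y, theta0, theta1, m)
        theta0 -= learning_rate * d0   # simultaneous update of both thetas
        theta1 -= learning_rate * d1
        cost = compute_cost(theta0 + theta1 * X, Y, m)
    return theta0, theta1, cost
```

With a small learning rate this converges toward the least-squares line; on exactly linear data such as `Y = 1 + 2*X` the fitted thetas approach 1 and 2 and the cost approaches zero.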