@anuvrat
Created May 26, 2012 23:09
Performs gradient descent to learn theta
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y);                   % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    % Vectorized batch update: theta := theta - (alpha/m) * X' * (X*theta - y)
    theta = theta - (X' * (X * theta - y)) * (alpha / m);
    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);
end

end
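For readers more familiar with Python, here is a sketch of the same vectorized batch update in NumPy. The function name and the inlined squared-error cost (the gist calls an external `computeCost` that is not shown) are my own; shapes are assumed as `X: (m, n)`, `y: (m,)`, `theta: (n,)`.

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """NumPy translation of the gist's MATLAB gradientDescent.

    X     : (m, n) design matrix
    y     : (m,)   target values
    theta : (n,)   initial parameters
    """
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        # Same vectorized step: theta := theta - (alpha/m) * X' * (X*theta - y)
        theta = theta - (alpha / m) * (X.T @ (X @ theta - y))
        # Squared-error cost J(theta) = (1/(2m)) * sum((X*theta - y)^2),
        # inlined here in place of the gist's computeCost
        J_history[it] = (1.0 / (2 * m)) * np.sum((X @ theta - y) ** 2)
    return theta, J_history
```

On a toy problem like `y = 2x` with a bias column of ones, the returned cost history decreases monotonically and `theta` approaches `[0, 2]`.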
EmanDiab commented Jul 5, 2019

Forgive me, I was just wondering what `X'` means. I am new to MATLAB and these symbols really confuse me; I was trying to figure out what it does.

@TrangPhuongMai
That's the matrix transpose: `X'` flips rows and columns of `X`. (Strictly, `'` is the complex conjugate transpose in MATLAB; use `.'` for a plain transpose. For real matrices like the ones here they are the same.)
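A small shape check may make this clearer. The point of `X'` in the update is dimensional: the residual `X*theta - y` has one entry per training example, and multiplying by the transpose collapses it to one entry per parameter, which is the gradient. The matrix values below are made up for illustration; NumPy's `X.T` plays the role of MATLAB's `X'` for real arrays.

```python
import numpy as np

# Example design matrix: m = 3 training examples, n = 2 parameters
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])          # shape (3, 2)

# Stand-in for the residual X*theta - y, one entry per example
r = np.array([1.0, 1.0, 1.0])       # shape (3,)

# X.T has shape (2, 3), so X.T @ r has shape (2,) -- one entry per
# parameter, exactly what the gradient step in the gist needs
grad = X.T @ r
```

Without the transpose, `X @ r` would not even be defined here, since the inner dimensions (2 and 3) would not match.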
