@s-trooper
Created March 12, 2015 12:53
ml
X = inputs;                      % m x n design matrix (one example per row)
y = outputs;                     % m x 1 vector of 0/1 labels
m = length(y);
theta = zeros(size(X, 2), 1);    % initial theta: one parameter per feature
sigmoid = @(z) 1.0 ./ (1.0 + e .^ -z);
% gradient descent, here we go find theta ...
alpha = 0.001;
num_iters = 1000000;
for iter = 1:num_iters
  h = sigmoid(X * theta);        % recompute the hypothesis every iteration
  gradient = X' * (h - y) / m;
  theta = theta - alpha * gradient;
end
% cost function to check the hypothesis ... should always be decreasing
h = sigmoid(X * theta);
cost = (-y' * log(h) - (1 - y)' * log(1 - h)) / m;
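The same loop can be cross-checked with a minimal pure-Python port. Variable names mirror the Octave snippet; the tiny dataset (`y = 1` once the feature reaches 3, with an intercept column prepended) is made up here purely for illustration, and the learning rate and iteration count are tuned down to suit it.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy dataset: intercept column plus one feature; labels flip at x >= 3.
X = [[1.0, 1.0], [1.0, 2.0], [1.0, 4.0], [1.0, 5.0]]
y = [0.0, 0.0, 1.0, 1.0]
m = len(y)
theta = [0.0, 0.0]          # one parameter per column of X
alpha = 0.1
num_iters = 5000

for _ in range(num_iters):
    # h = sigmoid(X * theta), recomputed every iteration
    h = [sigmoid(sum(xj * tj for xj, tj in zip(xi, theta))) for xi in X]
    # gradient = X' * (h - y) / m
    gradient = [sum(X[i][j] * (h[i] - y[i]) for i in range(m)) / m
                for j in range(len(theta))]
    theta = [t - alpha * g for t, g in zip(theta, gradient)]

# Cross-entropy cost after training; it should be small on separable data.
h = [sigmoid(sum(xj * tj for xj, tj in zip(xi, theta))) for xi in X]
cost = -sum(yi * math.log(hi) + (1 - yi) * math.log(1 - hi)
            for yi, hi in zip(y, h)) / m
```

After training, the fitted hypothesis predicts below 0.5 for the `y = 0` examples and above 0.5 for the `y = 1` examples, and the cost is far below the untrained value of `log(2) ≈ 0.693`.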