@egonSchiele
Created March 9, 2016 23:48
Linear regression with fminunc
x = [1000, 2000, 4000];
y = [200000, 250000, 300000];
% Given a theta_0 and theta_1 (packed into the vector theta), this
% function calculates their cost: how far off the predictions are,
% on average. fminunc searches for the theta that makes this cost
% as small as possible, so the cost should go down as it iterates.
function distance = cost(theta)
  theta_0 = theta(1);
  theta_1 = theta(2);
  % Octave functions don't see the script's variables, so the
  % training data has to be repeated here.
  x = [1000, 2000, 4000];
  y = [200000, 250000, 300000];
  distance = 0;
  for i = 1:length(x) % arrays in Octave are indexed starting at 1
    square_feet = x(i);
    predicted_value = theta_0 + theta_1 * square_feet;
    actual_value = y(i);
    % how far off was the predicted value (make sure you take the absolute value)?
    distance = distance + abs(actual_value - predicted_value);
  end
  % get how far off we were on average
  distance = distance / length(x);
end
theta = [0, 0]; % starting guess for the search
opt_theta = fminunc(@cost, theta);
% plot the data and the best-fit line
figure;
set(0, 'defaultaxesposition', [0.15, 0.1, 0.7, 0.7]);
% function to calculate the predicted value
function result = h(x, t0, t1)
  result = t0 + t1 * x;
end
plot(x, y, 'rx');
hold on;
% fminunc returned the best thetas it found in opt_theta,
% so use those (theta_0 and theta_1 only exist inside cost)
plot(1000:4000, h(1000:4000, opt_theta(1), opt_theta(2)));
% tell me how much to sell a 3000 square foot home for:
disp(h(3000, opt_theta(1), opt_theta(2)));
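The cost being minimized here is just the mean absolute error of the predictions. As a cross-check of the Octave logic (a sketch, not part of the original gist), the same cost function in Python:

```python
# Mean-absolute-error cost, mirroring the Octave cost() function above.
x = [1000, 2000, 4000]
y = [200000, 250000, 300000]

def cost(theta):
    theta_0, theta_1 = theta
    # average |actual - predicted| over all training examples
    return sum(abs(yi - (theta_0 + theta_1 * xi)) for xi, yi in zip(x, y)) / len(x)

# With theta = [0, 0] every prediction is 0, so the cost is mean(y) = 250000.0
print(cost([0, 0]))
# A better guess (values chosen here just for illustration) gives a lower cost
print(cost([150000, 37.5]))
```

Any theta that fminunc reports as optimal should score lower on this cost than both of these guesses.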