Predict house price based on area in square feet, locality, and number of bedrooms. I created this demo project while completing my ML course.
850,3,3,4500000
1000,4,4,3500000
585,3,3,3100000
801,4,4,5500000
729,4,3,4200000
776,4,3,4500000
700,2,3,7100000
800,4,3,4400000
1500,3,5,10000000
1800,3,4,7500000
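
Each row above is one training example: area in square feet, a locality code, the number of bedrooms, and the sale price. As a quick sanity check before training (a minimal sketch; it assumes the data above is saved as ex1TrainingSet.txt in the working directory):

% Load the data and report its shape and the price range.
data = load('ex1TrainingSet.txt');
fprintf('%d examples, %d columns\n', size(data, 1), size(data, 2));
fprintf('price range: %d to %d\n', min(data(:, 4)), max(data(:, 4)));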
% This program runs with Octave 3.8+.
clear; close all; clc;

% Load the training set: columns are area (sq ft), locality, bedrooms, price.
data = load('ex1TrainingSet.txt');
X = data(:, 1:3);
y = data(:, 4);
m = length(y);   % number of training examples

% Query point to predict: an 810 sq-ft house in locality 3 with 3 bedrooms.
dd = [810 3 3];
ddd = [1 dd];    % the same point with the intercept term prepended

% Normalize features, add the intercept column, and fit by gradient descent.
[X, mu, sigma] = featureNormalize(X);
X = [ones(m, 1) X];
alpha = 0.01;      % learning rate
num_iters = 400;   % number of gradient-descent iterations
theta = zeros(4, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);
% Plot the convergence of the cost function.
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');
% Predict the price of the query point: apply the same normalization used
% for training, then prepend the intercept term.
d = (dd - mu) ./ sigma;
d = [1 d];
price = d * theta;
fprintf(['Predicted price of a %d sq-ft, %d locality and %d bed room house ' ...
         '(using gradient descent):\n $%f\n'], dd(1), dd(2), dd(3), price);
% Solve the same regression in closed form with the normal equation
% (no feature scaling needed).
data = csvread('ex1TrainingSet.txt');
X = data(:, 1:3);
y = data(:, 4);
m = length(y);
X = [ones(m, 1) X];
theta = normalEqn(X, y);
% Predict with the closed-form theta; ddd already carries the intercept term,
% and no normalization is needed here.
d = ddd;
price = d * theta;
fprintf(['Predicted price of a %d sq-ft, %d locality and %d bed room house ' ...
         '(using the normal equation):\n $%f\n'], d(2), d(3), d(4), price);
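
Note that the two predictions prepare the query differently: the gradient-descent theta was fitted on normalized features, so the query must first be shifted by mu and scaled by sigma, while the normal-equation theta was fitted on the raw features and takes [1 dd] directly. Mixing these up silently produces a wrong price. A hypothetical guard (the names price_gd and price_ne are not in the original script, which reuses price for both predictions):

% With price_gd and price_ne holding the two predictions, they should agree
% closely; a large gap usually means one solver got a wrongly-scaled input.
% assert(abs(price_gd - price_ne) < 1e-2 * abs(price_ne));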
% FEATURENORMALIZE Scales each feature to zero mean and unit standard
% deviation, returning the per-column mean and std so the same transform
% can be applied to new examples.
function [X_norm, mu, sigma] = featureNormalize(X)
  mu = mean(X);
  sigma = std(X);
  m = size(X, 1);
  X_norm = (X - repmat(mu, m, 1)) ./ repmat(sigma, m, 1);
end
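
A worked example of featureNormalize on a tiny matrix, showing what mu, sigma, and X_norm come back as (illustrative values, not part of the original gist):

A = [1 10; 2 20; 3 30];
[A_norm, mu_a, sigma_a] = featureNormalize(A);
% mu_a is [2 20]; sigma_a is [1 10] (Octave's std divides by n-1)
% A_norm is [-1 -1; 0 0; 1 1]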
% GRADIENTDESCENTMULTI Runs num_iters steps of batch gradient descent,
% recording the cost after every step so convergence can be plotted.
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
  m = length(y);
  J_history = zeros(num_iters, 1);
  for iter = 1:num_iters
    % Vectorized update: theta := theta - (alpha/m) * X' * (X*theta - y)
    theta = theta - (alpha / m) * (X' * (X * theta - y));
    J_history(iter) = computeCostMulti(X, y, theta);
  end
end
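
If the cost curve rises or oscillates instead of falling, alpha is too large. A common way to pick alpha (a sketch, not part of the course code; it assumes X and y are the normalized design matrix and targets from the gradient-descent section) is to try values spaced roughly 3x apart and compare the final costs:

% Try several learning rates and compare where the cost ends up.
for alpha_try = [0.3 0.1 0.03 0.01]
  [~, J_try] = gradientDescentMulti(X, y, zeros(size(X, 2), 1), alpha_try, 50);
  fprintf('alpha = %.2f -> final cost %.3e\n', alpha_try, J_try(end));
end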
% NORMALEQN Solves the least-squares problem in closed form. pinv keeps the
% solve well-defined even when X'*X is singular.
function [theta] = normalEqn(X, y)
  theta = pinv(X' * X) * X' * y;
end
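
The closed form comes from setting the gradient of the cost to zero, which gives the normal equation X'*X*theta = X'*y. When X'*X is full rank, a backslash solve is an equivalent and cheaper alternative (a stylistic option, not what the course code does):

% theta = (X' * X) \ (X' * y);   % equivalent when X'*X is invertible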
% COMPUTECOSTMULTI Mean-squared-error cost for linear regression:
% J = (1/(2m)) * sum((X*theta - y).^2)
function J = computeCostMulti(X, y, theta)
  m = length(y);   % number of training examples
  J = (1 / (2 * m)) * sum((X * theta - y) .^ 2);
end
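
An equivalent fully vectorized form of the same cost, sometimes preferred because it avoids the element-wise square and sum (a minor stylistic alternative):

% r = X * theta - y;
% J = (r' * r) / (2 * m);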