Created
November 2, 2011 19:04
Compute Cost Multiple Variables
function J = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
%   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

J = (0.5/m) * (X*theta - y)' * (X*theta - y);

% =========================================================================

end
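For readers who want to check the arithmetic, here is a minimal Python sketch of the same vectorized cost using plain lists (no NumPy), so every step is explicit. The dataset values are made up for illustration.

```python
# Hypothetical tiny dataset: X has an intercept column of ones, then one feature.
def compute_cost_multi(X, y, theta):
    m = len(y)
    # h = X * theta  (matrix-vector product, one hypothesis value per row)
    h = [sum(xij * tj for xij, tj in zip(row, theta)) for row in X]
    # J = (1/(2m)) * (h - y)' * (h - y)  (sum of squared residuals over 2m)
    return sum((hi - yi) ** 2 for hi, yi in zip(h, y)) / (2 * m)

X = [[1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]
y = [2.0, 4.0, 5.0]
theta = [0.5, 1.0]
print(compute_cost_multi(X, y, theta))  # 0.125
```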
@punkytun
They are the exact same thing.
.^2 squares a vector element-wise; summing those squares gives the same scalar as multiplying the vector by its own transpose.
The logic:
Given a column vector A with dimensions m x 1, A*A can't work because the inner dimensions don't match (m x 1 times m x 1).
But A'*A is 1 x m times m x 1, which gives a 1 x 1 result: each element of A multiplied by itself, then summed.
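The equivalence can be checked directly. A short Python sketch with an arbitrary residual vector:

```python
# For a column vector r, sum(r .^ 2) and r' * r produce the same scalar.
r = [0.5, -0.5, -0.5]  # arbitrary example residuals

sum_of_squares = sum(ri ** 2 for ri in r)  # element-wise square, then sum
inner_product = sum(ri * ri for ri in r)   # r' * r as an explicit dot product

print(sum_of_squares, inner_product)  # 0.75 0.75
```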
J = sum((X*theta - y).^2)/(2*m);
Is your answer the same as this code? I used the line above and still got the mark for this exercise. Could you explain more about this? Thanks!