@RafayAK
Last active June 20, 2019 09:02
This helper function computes the squared error cost and its derivative
import numpy as np


def compute_cost(Y, Y_hat):
    """
    This function computes and returns the Cost and its derivative.
    It uses the Squared Error Cost function -> (1/2m)*sum(Y - Y_hat)^2

    Args:
        Y: labels of data
        Y_hat: predictions (activations) from the last layer, the output layer

    Returns:
        cost: the Squared Error Cost result
        dY_hat: gradient of the Cost w.r.t. Y_hat
    """
    m = Y.shape[1]  # number of examples

    cost = (1 / (2 * m)) * np.sum(np.square(Y - Y_hat))
    cost = np.squeeze(cost)  # remove extraneous dimensions to give just a scalar

    dY_hat = -1 / m * (Y - Y_hat)  # derivative of the squared error cost function

    return cost, dY_hat
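As a quick sanity check, here is a sketch of how the helper might be called on a hypothetical toy batch (the inputs below are made up for illustration; the helper is restated so the snippet runs on its own):

```python
import numpy as np


def compute_cost(Y, Y_hat):
    """Squared Error Cost (1/2m)*sum(Y - Y_hat)^2 and its gradient w.r.t. Y_hat."""
    m = Y.shape[1]  # number of examples
    cost = (1 / (2 * m)) * np.sum(np.square(Y - Y_hat))
    cost = np.squeeze(cost)  # reduce to a scalar
    dY_hat = -1 / m * (Y - Y_hat)  # derivative of the squared error cost
    return cost, dY_hat


# hypothetical toy batch: 1 output unit, m = 4 examples, shape (1, m)
Y = np.array([[1.0, 0.0, 1.0, 0.0]])
Y_hat = np.array([[0.9, 0.1, 0.8, 0.2]])

cost, dY_hat = compute_cost(Y, Y_hat)
# cost = (1/(2*4)) * (0.01 + 0.01 + 0.04 + 0.04) = 0.0125
# dY_hat = -(1/4) * (Y - Y_hat) = [[-0.025, 0.025, -0.05, 0.05]]
```

Note that `m` is taken from `Y.shape[1]`, so labels and predictions are expected as row vectors with one column per example.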