
@khanhnamle1994
Created May 21, 2018 03:57
Using cross-entropy to calculate the loss function
def calculate_loss(self, x, y):
    assert len(x) == len(y)
    output = Softmax()
    # Run the input sequence through the network once
    layers = self.forward_propagation(x)
    loss = 0.0
    # Accumulate the cross-entropy loss of the output at every time step
    for i, layer in enumerate(layers):
        loss += output.loss(layer.mulv, y[i])
    # Average over the number of time steps
    return loss / float(len(y))

def calculate_total_loss(self, X, Y):
    loss = 0.0
    # Average the per-sequence loss over all training examples
    for i in range(len(Y)):
        loss += self.calculate_loss(X[i], Y[i])
    return loss / float(len(Y))
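
The Softmax class used above is not shown in this gist. As a minimal sketch of what its loss method could look like (assuming layer.mulv holds the raw output scores for one time step and y[i] is the index of the correct class), the cross-entropy loss is the negative log probability that the softmax assigns to the correct class:

import numpy as np

class Softmax:
    def predict(self, x):
        # Numerically stable softmax over the raw scores
        exp_scores = np.exp(x - np.max(x))
        return exp_scores / np.sum(exp_scores)

    def loss(self, x, y):
        # Cross-entropy: negative log probability of the correct class index y
        probs = self.predict(x)
        return -np.log(probs[y])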