@shamdasani · Created Aug 5, 2017

def backward(self, X, y, o):
    # backward propagate through the network
    self.o_error = y - o  # error in output
    self.o_delta = self.o_error * self.sigmoidPrime(o)  # applying derivative of sigmoid to error

    self.z2_error = self.o_delta.dot(self.W2.T)  # z2 error: how much our hidden layer weights contributed to output error
    self.z2_delta = self.z2_error * self.sigmoidPrime(self.z2)  # applying derivative of sigmoid to z2 error

    self.W1 += X.T.dot(self.z2_delta)  # adjusting first set (input --> hidden) weights
    self.W2 += self.z2.T.dot(self.o_delta)  # adjusting second set (hidden --> output) weights
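The gist shows only the `backward` method. Below is a runnable sketch of the surrounding class it implies, using the names it references (`W1`, `W2`, `z2`, `sigmoidPrime`). The layer sizes (2-3-1), the random seed, and the training data are assumptions added for illustration, not part of the original snippet.

```python
import numpy as np

class NeuralNetwork:
    """Minimal 2-3-1 feedforward network matching the backward() method above.
    Layer sizes and training data are illustrative assumptions."""

    def __init__(self):
        np.random.seed(1)  # assumed seed, for reproducibility only
        self.W1 = np.random.randn(2, 3)  # input --> hidden weights
        self.W2 = np.random.randn(3, 1)  # hidden --> output weights

    def sigmoid(self, s):
        return 1.0 / (1.0 + np.exp(-s))

    def sigmoidPrime(self, s):
        # s is already a sigmoid activation, so the derivative is s * (1 - s)
        return s * (1.0 - s)

    def forward(self, X):
        self.z = X.dot(self.W1)         # hidden-layer pre-activation
        self.z2 = self.sigmoid(self.z)  # hidden-layer activation
        self.z3 = self.z2.dot(self.W2)  # output pre-activation
        return self.sigmoid(self.z3)    # network output

    def backward(self, X, y, o):
        # backward propagate through the network
        self.o_error = y - o
        self.o_delta = self.o_error * self.sigmoidPrime(o)
        self.z2_error = self.o_delta.dot(self.W2.T)
        self.z2_delta = self.z2_error * self.sigmoidPrime(self.z2)
        self.W1 += X.T.dot(self.z2_delta)
        self.W2 += self.z2.T.dot(self.o_delta)

    def train(self, X, y):
        o = self.forward(X)
        self.backward(X, y, o)

# Illustrative training data (assumed), scaled into [0, 1]
X = np.array([[2, 9], [1, 5], [3, 6]], dtype=float)
X = X / np.amax(X, axis=0)
y = np.array([[92], [86], [89]], dtype=float) / 100

nn = NeuralNetwork()
initial_loss = np.mean(np.square(y - nn.forward(X)))
for _ in range(1000):
    nn.train(X, y)
final_loss = np.mean(np.square(y - nn.forward(X)))
```

Note the design choice in `sigmoidPrime`: because `o` and `z2` are already sigmoid activations, the derivative is computed directly as `s * (1 - s)` rather than re-evaluating the sigmoid.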