@tswedish
Last active May 26, 2020 20:32
def backward(self, output_gradient=1.):
    # combine gradients from other paths and propagate to children
    self.gradient += output_gradient
    local_gradient = self.derivative_op(*self.calc_input_values())
    for differential, input_variable in zip(local_gradient, self.input_variables):
        input_variable.backward(differential * output_gradient)
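The `backward` method above is the recursive step of reverse-mode autodiff: each node accumulates the upstream gradient, multiplies it by its local partial derivatives, and passes the products to its inputs. A minimal, self-contained sketch of a `Variable` class this method could live in is shown below; the names `derivative_op`, `calc_input_values`, and `input_variables` mirror the gist, but the surrounding class structure and the `mul` helper are assumptions for illustration.

```python
class Variable:
    # Minimal sketch (assumed structure) of a scalar autodiff node.
    def __init__(self, value, input_variables=(), derivative_op=None):
        self.value = value
        self.input_variables = input_variables
        # derivative_op maps input values -> tuple of local partials
        self.derivative_op = derivative_op or (lambda *args: ())
        self.gradient = 0.

    def calc_input_values(self):
        return [v.value for v in self.input_variables]

    def backward(self, output_gradient=1.):
        # combine gradients from other paths and propagate to children
        self.gradient += output_gradient
        local_gradient = self.derivative_op(*self.calc_input_values())
        for differential, input_variable in zip(local_gradient, self.input_variables):
            input_variable.backward(differential * output_gradient)


def mul(a, b):
    # product rule: d(ab)/da = b, d(ab)/db = a
    return Variable(a.value * b.value, (a, b), lambda av, bv: (bv, av))


x = Variable(3.)
y = Variable(4.)
z = mul(x, y)      # z = x * y = 12.0
z.backward()       # seed with dz/dz = 1
print(x.gradient)  # dz/dx = y = 4.0
print(y.gradient)  # dz/dy = x = 3.0
```

Because `backward` accumulates with `+=` rather than assigning, a variable that feeds several downstream nodes correctly sums the gradient contributions from each path.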