
@marcopeix
Created February 26, 2019 18:08
def linear_activation_backward(dA, cache, activation):
    # Unpack the caches stored during the forward pass.
    linear_cache, activation_cache = cache
    if activation == "relu":
        # Backprop through the ReLU activation, then the linear step.
        dZ = relu_backward(dA, activation_cache)
        dA_prev, dW, db = linear_backward(dZ, linear_cache)
    elif activation == "sigmoid":
        # Backprop through the sigmoid activation, then the linear step.
        dZ = sigmoid_backward(dA, activation_cache)
        dA_prev, dW, db = linear_backward(dZ, linear_cache)
    return dA_prev, dW, db
ByronKruger commented Apr 22, 2019

The implementations of sigmoid_backward() and relu_backward() are missing.
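For reference, here is a minimal sketch of the missing helpers (including `linear_backward`, which the snippet also calls). It assumes the common cache convention where `activation_cache` holds the pre-activation values `Z` and `linear_cache` holds `(A_prev, W, b)` from the forward pass, with examples stored column-wise; the gist itself does not show these definitions, so treat this as one plausible implementation rather than the author's own:

```python
import numpy as np

def relu_backward(dA, activation_cache):
    # activation_cache is assumed to hold Z, the pre-activation values.
    Z = activation_cache
    dZ = np.array(dA, copy=True)
    dZ[Z <= 0] = 0  # ReLU derivative: 0 where Z <= 0, 1 elsewhere
    return dZ

def sigmoid_backward(dA, activation_cache):
    Z = activation_cache
    s = 1 / (1 + np.exp(-Z))
    return dA * s * (1 - s)  # chain rule with sigmoid'(Z) = s * (1 - s)

def linear_backward(dZ, linear_cache):
    # linear_cache is assumed to hold (A_prev, W, b) from the forward pass,
    # where Z = W @ A_prev + b and each column of A_prev is one example.
    A_prev, W, b = linear_cache
    m = A_prev.shape[1]  # number of examples
    dW = (1 / m) * dZ @ A_prev.T
    db = (1 / m) * np.sum(dZ, axis=1, keepdims=True)
    dA_prev = W.T @ dZ
    return dA_prev, dW, db
```

With these in place, `linear_activation_backward` returns gradients whose shapes match `A_prev`, `W`, and `b` respectively.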
