@RaphaelMeudec
Created July 18, 2019 15:12
Guided Backpropagation at inference time with Tensorflow 2
# Forward pass: record operations so gradients of the target class score
# can be taken with respect to the convolutional feature maps.
with tf.GradientTape() as tape:
    conv_outputs, predictions = grad_model(np.array([img]))
    loss = predictions[:, CAT_CLASS_INDEX]

output = conv_outputs[0]
grads = tape.gradient(loss, conv_outputs)[0]

# Apply guided backpropagation: keep only gradients that flow through
# positive activations (gate_f) and are themselves positive (gate_r).
gate_f = tf.cast(output > 0, 'float32')
gate_r = tf.cast(grads > 0, 'float32')
guided_grads = gate_f * gate_r * grads
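
For context, the snippet assumes a grad_model that returns both the feature maps of a target conv layer and the final predictions, plus an input image img and a target class index CAT_CLASS_INDEX. Below is a minimal setup sketch, assuming a VGG16 backbone; the layer name, image path, and class index are illustrative and not part of the gist.

import numpy as np
import tensorflow as tf

IMAGE_PATH = "cat.jpg"       # hypothetical input image
LAYER_NAME = "block5_conv3"  # last conv layer of VGG16 (assumed target layer)
CAT_CLASS_INDEX = 281        # ImageNet "tabby cat" index (illustrative)

model = tf.keras.applications.VGG16(weights="imagenet")

# Model that maps the input image to both the target layer's feature maps
# and the final predictions.
grad_model = tf.keras.models.Model(
    inputs=model.inputs,
    outputs=[model.get_layer(LAYER_NAME).output, model.output],
)

img = tf.keras.preprocessing.image.load_img(IMAGE_PATH, target_size=(224, 224))
img = tf.keras.preprocessing.image.img_to_array(img)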
@Student-HXJ
Hi, does conv_outputs refer to the last convolutional layer?
If so, how can I hook the first layer to get its gradients?
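
In the snippet, conv_outputs are the feature maps of whichever layer grad_model exposes (the last conv layer in the sketch above). One possible way to get gradients for the first layer instead, sketched under the same VGG16 assumptions and not taken from the thread:

# Build a model exposing the first conv layer ("block1_conv1" in VGG16).
first_layer_model = tf.keras.models.Model(
    inputs=model.inputs,
    outputs=[model.get_layer("block1_conv1").output, model.output],
)

with tf.GradientTape() as tape:
    first_conv_outputs, predictions = first_layer_model(np.array([img]))
    loss = predictions[:, CAT_CLASS_INDEX]

# Gradients of the class score w.r.t. the first layer's feature maps.
first_layer_grads = tape.gradient(loss, first_conv_outputs)[0]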
