
@MovsisyanM
Created May 18, 2022 12:07
Capped ReLU activation function
import tensorflow as tf

def capped_relu(x):
    """Mirrored-Z-shaped activation function: min(10, relu(x))."""
    return tf.minimum(tf.keras.activations.relu(x), 10)
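For intuition, the clamp is elementwise `min(10, max(0, x))`: zero for negative inputs, identity on [0, 10], and flat at 10 above that. A plain NumPy sketch of the same math (NumPy version and function name are illustrative, not part of the original gist):

```python
import numpy as np

def capped_relu_np(x):
    # Elementwise: 0 for negatives, x on [0, 10], capped at 10 above.
    return np.minimum(np.maximum(x, 0.0), 10.0)

capped_relu_np(np.array([-5.0, 3.0, 42.0]))  # elementwise -> 0, 3, 10
```

Capping the activation bounds the output range, which can help against exploding activations; the cap value 10 here is a fixed constant chosen by the author.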