@naure · Last active June 10, 2017 19:34
SELU in Keras and Numpy
import keras
from keras import backend as K

def kr_selu(x, alpha=1.6732632423543772848170429916717, scale=1.0507009873554804934193349852946):
    """Scaled Exponential Linear Units.

    The magic constants target activations with zero mean and unit variance.
    See https://arxiv.org/abs/1706.02515
    """
    return scale * K.elu(x, alpha)

# Make the activation available as "selu"
keras.activations.selu = kr_selu
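
As a usage sketch (assuming a Keras 2.x of the same era; the layer sizes and input shape here are made up purely for illustration), the activation can be passed to a layer as the callable, or, thanks to the assignment above, looked up by the name "selu":

from keras.models import Sequential
from keras.layers import Dense

# Hypothetical toy model; sizes and input dimension are arbitrary.
model = Sequential([
    Dense(64, input_shape=(32,), activation=kr_selu),
    Dense(10, activation='softmax'),
])
model.compile(optimizer='sgd', loss='categorical_crossentropy')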
import numpy as np

# Numpy version of the same activation, for use outside Keras.
alpha = 1.6732632423543772848170429916717
scale = 1.0507009873554804934193349852946

def selu(x, alpha=alpha, scale=scale):
    return scale * np.where(x >= 0.0, x, alpha * np.exp(x) - alpha)
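
As a quick numerical sanity check of the self-normalizing property from the paper, applying selu to standard-normal samples should leave the mean near 0 and the variance near 1 (a sketch; the sample size is arbitrary):

# Standard-normal inputs sit at the SELU fixed point, so the output
# statistics should stay at roughly zero mean and unit variance.
x = np.random.randn(1000000)
y = selu(x)
print(y.mean(), y.var())  # expect values close to 0.0 and 1.0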