@digantamisra98
Created August 12, 2019 12:58
Mish Class Definition in Keras
# Keras Implementation of Mish Activation Function.

# Import Necessary Modules.
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

from keras.engine.base_layer import Layer
from keras import backend as K


class Mish(Layer):
    '''
    Mish Activation Function.

    .. math::
        mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + e^{x}))

    Shape:
        - Input: Arbitrary. Use the keyword argument `input_shape`
          (tuple of integers, does not include the samples axis)
          when using this layer as the first layer in a model.
        - Output: Same shape as the input.

    Examples:
        >>> X_input = Input(input_shape)
        >>> X = Mish()(X_input)
    '''

    def __init__(self, **kwargs):
        super(Mish, self).__init__(**kwargs)
        self.supports_masking = True

    def call(self, inputs):
        # Mish: x * tanh(softplus(x)).
        return inputs * K.tanh(K.softplus(inputs))

    def get_config(self):
        # Mish introduces no extra arguments, so the base config is enough.
        base_config = super(Mish, self).get_config()
        return dict(list(base_config.items()))

    def compute_output_shape(self, input_shape):
        return input_shape
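
A minimal usage sketch with the standalone Keras API that the imports above target (the layer sizes here are arbitrary placeholders):

from keras.layers import Input, Dense
from keras.models import Model

# Use Mish like any other layer in the functional API.
X_input = Input(shape=(64,))
X = Dense(32)(X_input)
X = Mish()(X)
X = Dense(10, activation="softmax")(X)

model = Model(inputs=X_input, outputs=X)
model.summary()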
@digantamisra98 (Author) commented:

@rupshali You can define mish as a function rather than as a Layer, and use it with any Keras layer that accepts an activation argument by passing the function itself.
For example, defining Mish as a function:

## Mish Activation Function
import tensorflow as tf

def mish(x):
    # softplus(x) = ln(1 + e^x); tf.log was renamed to tf.math.log in TF 2.x.
    return tf.keras.layers.Lambda(lambda x: x * tf.tanh(tf.math.log(1.0 + tf.exp(x))))(x)
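
Optionally, the function can also be registered under a string name so layers accept activation='mish' directly (a small sketch using tf.keras utilities):

from tensorflow.keras.layers import Activation
from tensorflow.keras.utils import get_custom_objects

# Register mish so it can be referenced by name, e.g. Dense(64, activation='mish').
get_custom_objects().update({'mish': Activation(mish)})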

Defining a network with Mish activations:

## LeNet Architecture
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Activation

# inputShape and numClasses are assumed to be defined for your dataset.
model = Sequential()
model.add(Conv2D(20, 5, padding="same", input_shape=inputShape, activation=mish))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))

model.add(Conv2D(50, 5, padding="same", activation=mish))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))

model.add(Flatten())
model.add(Dense(500, activation=mish))

model.add(Dense(numClasses))
model.add(Activation("softmax"))
model.summary()
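
Because mish is a custom callable, it has to be supplied again when a saved model is reloaded. A minimal sketch (the file name is hypothetical):

from tensorflow.keras.models import load_model

model.save("lenet_mish.h5")  # hypothetical path
# Custom activations must be passed back in via custom_objects when loading.
model = load_model("lenet_mish.h5", custom_objects={"mish": mish})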

Hope this helps.

rupshali commented Oct 6, 2020

Thanks a lot @digantamisra98
