@digantamisra98
Last active August 12, 2019 00:27
'''
Applies the mish function element-wise:
mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + exp(x)))
'''

# import PyTorch
import torch
from torch import nn
# functional API, used here for softplus
import torch.nn.functional as F


class Mish(nn.Module):
    '''
    Applies the mish function element-wise:
    mish(x) = x * tanh(softplus(x)) = x * tanh(ln(1 + exp(x)))

    Shape:
        - Input: (N, *) where * means any number of additional dimensions
        - Output: (N, *), same shape as the input

    Examples:
        >>> m = Mish()
        >>> input = torch.randn(2)
        >>> output = m(input)
    '''

    def __init__(self):
        '''
        Init method.
        '''
        super().__init__()

    def forward(self, input):
        '''
        Forward pass of the function.
        '''
        # mish(x) = x * tanh(softplus(x)), computed inline so the module
        # has no dependency on a repo-local `functional` module
        return input * torch.tanh(F.softplus(input))
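As a quick sanity check, the module can be exercised on a few fixed values. This is a minimal sketch: the self-contained `Mish` copy below is an assumption based on the gist's definition mish(x) = x * tanh(ln(1 + exp(x))), and the commented properties follow directly from that formula.

```python
import torch
import torch.nn.functional as F
from torch import nn


class Mish(nn.Module):
    # self-contained copy of the gist's module for this example
    def forward(self, input):
        return input * torch.tanh(F.softplus(input))


m = Mish()
x = torch.tensor([0.0, 1.0, -1.0])
y = m(x)
# mish(0) = 0, because the leading x factor is zero
# the output always has the same shape as the input
assert y.shape == x.shape
print(y)
```

Note that PyTorch 1.9 and later ship a built-in `torch.nn.functional.mish`, which computes the same function and can replace the hand-rolled forward pass on recent versions.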