
@lintangsutawika
Last active July 24, 2023 09:19
Simple PyTorch implementation of Concatenated ReLU
import torch
import torch.nn as nn
import torch.nn.functional as F

# Inspired by "Understanding and Improving Convolutional Neural Networks
# via Concatenated Rectified Linear Units"
# https://arxiv.org/pdf/1603.05201.pdf
class CReLU(nn.Module):
    def __init__(self, inplace=False):
        super(CReLU, self).__init__()
        self.inplace = inplace

    def forward(self, x):
        # Concatenate x and -x along the last dimension, then apply ReLU,
        # so both the positive and negative phases of each activation are kept.
        x = torch.cat((x, -x), -1)
        return F.relu(x, inplace=self.inplace)
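
For illustration, a quick usage sketch (the variable names are arbitrary, not from the gist): applying CReLU to a batch of feature vectors doubles the size of the last dimension, and the output is non-negative because ReLU is applied after concatenation.

    crelu = CReLU()
    features = torch.randn(8, 64)   # batch of 8 feature vectors
    out = crelu(features)
    print(out.shape)                # torch.Size([8, 128]) -- last dim doubled
    assert (out >= 0).all()         # ReLU output is non-negative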
@knotgrass

I think dim should be -1

@lintangsutawika
Author

Thanks
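
As a side note, a minimal sketch (not part of the gist) of a variant with a configurable concatenation axis: the paper applies CReLU along the channel axis of conv feature maps (dim=1 for NCHW tensors), while dim=-1 suits flat feature vectors.

    class CReLUDim(nn.Module):
        """CReLU with a configurable concatenation axis (illustrative only)."""
        def __init__(self, dim=1, inplace=False):
            super().__init__()
            self.dim = dim
            self.inplace = inplace

        def forward(self, x):
            return F.relu(torch.cat((x, -x), dim=self.dim), inplace=self.inplace)

    # NCHW conv features: channels double from 16 to 32
    conv_maps = torch.randn(4, 16, 32, 32)
    print(CReLUDim(dim=1)(conv_maps).shape)  # torch.Size([4, 32, 32, 32])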
